Patent Abstract:
COLLIMATION DISPLAY WITH PIXEL LENSES. A monitor assembly (515) includes: (a) an image display system (10) that includes a series of pixels (120) and (b) a series of pixel lenses (115). In the series of pixel lenses, each lens is positioned to collimate or substantially collimate the light from a corresponding single pixel in the series of pixels. The monitor assembly (515) is used in a head-mounted display apparatus (500) that includes a frame (510) to support the monitor assembly (515) at a desired distance from the user's eyes. The head-mounted display apparatus may also include a beam splitter (520) to reflect images from the monitor assembly (515) into the user's eyes. The head-mounted display apparatus can provide the user with a wide field of view and can be of the augmented reality or immersive type.
Publication number: BR112013014975A2
Application number: R112013014975-2
Filing date: 2011-12-15
Publication date: 2020-08-11
Inventors: Gregory A. Harrison; David Alan Smith
Applicant: Lockheed Martin Corporation
IPC primary class:
Patent description:

Descriptive Report of the Invention Patent for "COLLIMATION DISPLAY WITH PIXEL LENSES".
CROSS-REFERENCE TO RELATED APPLICATIONS This application claims priority to US Provisional Application serial number 61/423,934 (entitled PIXEL LENS COLLIMATION DISPLAY, filed December 16, 2010), US Provisional Application serial number 61/424,162 (entitled PIXEL LENS APPROACH FOR AN AUGMENTED REALITY HEAD MOUNTED DISPLAY, filed December 17, 2010), and US Provisional Application serial number 61/424,166 (entitled PIXEL LENS APPROACH FOR AN IMMERSIVE HEAD MOUNTED DISPLAY, filed December 17, 2010), all of which are incorporated herein by reference.
FIELD This disclosure relates to a head-mounted display apparatus and, in particular, to a head-mounted display apparatus that employs a series of lenses to collimate or partially collimate the light emitted from a pixelated image display system (the series of lenses being referred to herein as "pixel lenses"). In certain embodiments, the apparatus also employs one or more reflective optical surfaces, for example, one or more free-space, ultra-wide-angle reflective optical surfaces (hereinafter referred to as "FS/UWA/RO surfaces"). In certain embodiments, the overall optical system is a non-pupil-forming system, that is, the controlling aperture (aperture stop) of the entire system is the pupil of the user's eye. The pixel lenses and, when used, the one or more reflective surfaces (for example, one or more FS/UWA/RO surfaces) are used to display images from a light emitting display system held in close proximity to a user's eye. BACKGROUND A head-mounted display such as a helmet-mounted display or an eyeglass-mounted display (abbreviated in the present
document as an "HMD") is a display device worn on an individual's head that has one or more small display devices located close to one eye or, more commonly, both eyes of the user. Some HMDs display only simulated (computer-generated) images, as opposed to real-world images, and consequently are often referred to as "virtual reality" or immersive HMDs.
Other HMDs superimpose (combine) a simulated image with a non-simulated real-world image.
The combination of non-simulated and simulated images allows the user of the HMD to view the world through, for example, a sight or optical instrument in which additional data relevant to the task to be performed is superimposed on the user's straight-ahead field of view (FOV).
The superposition is sometimes referred to as "augmented reality" or "mixed reality". The combination of a non-simulated real-world view with a simulated image can be achieved using a partially reflective/partially transmissive optical surface (a "beam splitter"), where the surface's reflectivity is used to display the simulated image as a virtual image (in the optical sense) and the surface's transmissivity is used to allow the user to view the real world directly (referred to as an "optical see-through system"). The combination of a real-world view with a simulated image can also be done electronically by accepting video of a real-world view from a camera and mixing it electronically with the simulated image using a combiner (referred to as a "video see-through system"). The combined image can then be presented to the user as a virtual image (in the optical sense) by means of a reflective optical surface, which in this case need not have transmissive properties.
From the above, it can be seen that reflective optical surfaces can be used in HMDs that provide the user with: (i) a combination of a simulated image and a non-simulated real-world image, (ii) a combination of a simulated image and a video image of the real world, or (iii) purely simulated images (the latter case is often referred to as an "immersive" system). In each of these cases, the reflective optical surface produces a virtual image (in the optical sense) that is viewed by the user.
Historically, such reflective optical surfaces have been part of optical systems whose exit pupils have substantially limited not only the dynamic field of view available to the user, but also the static field of view.
Specifically, to see the image produced by the optical system, the user needed to align his eye with the optical system's exit pupil and keep it so aligned, and even then, the image visible to the user might not cover the user's complete static field of view. That is, the optical systems previously used in HMDs that employed reflective optical surfaces were part of pupil-forming systems and thus had limited exit pupils.
The reason that systems are so limited is the fundamental fact that the field of human vision is remarkably wide.
Thus, the static field of view of a human eye, including both the eye's foveal and peripheral vision, is on the order of ~150° in the horizontal direction and on the order of ~130° in the vertical direction. (For the purposes of this disclosure, 150 degrees will be used as the straight-ahead static field of view of a nominal human eye.) Well-corrected optical systems with exit pupils capable of accommodating such a wide static field of view are few and far between, and when they do exist, they are expensive and bulky.
In addition, the operational field of view of the human eye (the dynamic field of view) is even greater, since the eye can rotate about its center of rotation, that is, the human brain can aim the human eye's foveal + peripheral field of view in different directions by changing the eye's direction of gaze.
For a nominal eye, the vertical range of motion is on the order of ~40° upward and ~60° downward, and the horizontal range of motion is on the order of ±~50° from straight ahead.
For an exit pupil of the size produced by the types of optical systems previously used in HMDs, even a small rotation of the eye can substantially reduce the overlap that existed between the eye's static field of view and the exit pupil, and larger rotations can make the image disappear completely.
Although theoretically possible, an exit pupil that moves in synchrony with the user's eye is impractical and would be prohibitively expensive. In view of these properties of the human eye, there are three fields of view that are relevant in terms of providing an optical system that allows a user to view an image generated by an image display system in the same manner as he/she would view the natural world.
The smallest of these three fields of vision is defined by the user's ability to rotate his eye and thus sweep his fovea across the outside world.
The maximum rotation is on the order of ±50° from straight ahead, so this field of view (the dynamic foveal field of view) is approximately 100°. The middle of the three fields of view is the straight-ahead static field of view, which includes both the user's foveal and peripheral vision.
As discussed above, this field of view (the static foveal + peripheral field of view) is on the order of 150°, for example, ~168°. The broadest of the three fields of view is defined by the user's ability to rotate his/her eye and thus sweep his/her foveal vision plus peripheral vision across the outside world.
Based on a maximum rotation on the order of ±50° and a static foveal + peripheral field of view on the order of 150°, this widest field of view (the dynamic foveal + peripheral field of view) is on the order of 200°. This increasing scale of fields of view, from at least 100 degrees to at least 150 degrees and then to at least 200 degrees, provides corresponding benefits to the user in terms of his/her ability to view images generated by an image display system in an intuitive and natural way.
In order for the human eye to focus easily on a monitor that is within 25.4 cm (10 inches) of the eye, a form of collimation needs to be applied to the rays of light emanating from the monitor.
The collimation serves to make the rays of light appear as if they originated from a greater distance than the actual distance between the eye and the monitor.
The greater apparent distance, in turn, allows the eye to focus readily on an image from the monitor. Some head-mounted displays use thick mirrors, lenses or prisms in an attempt to collimate the light from the monitor. These approaches add volume and weight, making such head-mounted displays more cumbersome and heavier than desired. Also, because these approaches seek to collimate the light from all the pixels as a group, they both lack the ability to control collimation on a pixel-by-pixel basis and tend to introduce optical aberrations into the resulting collimated light beam. Thus, there is a need for head-mounted displays that are compatible with both the focusing ability and at least the dynamic foveal field of view of the human eye. The present disclosure addresses these needs and provides head-mounted displays that produce collimated (or substantially collimated) light over a wide field of view.
DEFINITIONS In the remainder of this disclosure and in the claims, the term "virtual image" is used in its optical sense, that is, a virtual image is an image that is perceived as coming from a particular place although the light being perceived does not in fact originate there. Throughout this disclosure, the following expressions/terms will have the following meanings/scope: (1) The term "a reflective optical surface" (also referred to herein as a "reflective surface" or a "reflector") will include a surface (whether flat, curved, continuous or composed of spatially separated parts) that is only reflective, as well as a surface that is both reflective and transmissive. In either case, the reflectivity can be only partial, that is, part of the incident light can be transmitted through the surface. Likewise, when the surface is both reflective and transmissive, the reflectivity and/or the transmissivity can be partial. As discussed below, in certain embodiments, a single reflective optical surface can be used for both eyes, or each eye can have its own individual reflective optical surface.
Other variations include the use of multiple reflective optical surfaces for both eyes or individually for each eye.
Mix-and-match combinations can also be used, for example, a single reflective optical surface can be used for one eye and multiple reflective optical surfaces for the other eye.
As an alternative, one or multiple reflective optical surfaces can be provided for only one eye of the user.
The claims described below are intended to cover these and other configurations of reflective optical surfaces.
In particular, each claim that calls for a reflective optical surface is intended to cover a head-mounted display apparatus that includes at least one such surface. (2) The term "an image display system having a light emitting surface comprising a series of light emitting pixels" (also referred to herein as an "image display system" or a "display system") is used generally to include any system that has a pixelated surface (whether flat, curved, continuous or composed of spatially separated parts) that emits light to form a human-perceptible image, whether by transmitting light through the surface, generating light at the surface (for example, by a series of LEDs), reflecting light off the surface from another source, or the like.
The system can employ one or multiple image display devices, for example, one or multiple LED, OLED and/or LCD arrays.
As with reflective optical surfaces, a single image display system can be used for both eyes or each eye can have its own individual image display system.
Other variations include the use of multiple image display systems for both eyes or individually for each eye.
Mix-and-match combinations can also be used, for example, a single image display system can be used for one eye and multiple image display systems for the other eye.
As an alternative, one or multiple image display systems can be provided for only one eye of the user.
The claims described below are intended to cover these and other image display system configurations. In particular, each claim that calls for an image display system that has a light emitting surface comprising a series of light emitting pixels is intended to cover a head-mounted display apparatus that includes at least one such system. (3) The term "monitor assembly" refers to the combination of an image display system and a series of pixel lenses on the light emitting side of the image display system. (4) The term "binocular viewer" means a device that includes at least one separate optical element (for example, a display device and/or a reflective optical surface) for each eye. (5) The expression "field of view" and its abbreviation FOV refer to the "apparent" field of view in image (eye) space, as opposed to the "real" field of view in object (i.e., monitor) space. (6) The meaning of the term "substantially collimated" depends on the particular application of the technology described in this document, but in general, light from a light emitting pixel is "substantially collimated" if its vergence at the user's eye is greater than -1.0 diopter. For reference, a point source at 25 m has a vergence of -0.04 diopters, and so if a pixel lens or a combination of a pixel lens and a curved reflective optical surface (when used) causes the light from a pixel to appear to a user to come from a distance of 25 m, that light will have a vergence of -0.04 diopters at the user's eye, which is greater than -1.0, that is, less negative than -1.0, and so that light can be considered substantially collimated. For another reference, the light emitted from an image display system without any collimation can have a vergence on the order of -33 diopters.
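To make the vergence arithmetic in definition (6) concrete, the following minimal sketch (in Python; the function names and sample distances are illustrative, not from the patent) computes the vergence at the eye for light that appears to originate at a given distance:

```python
def vergence_diopters(apparent_distance_m: float) -> float:
    """Vergence at the eye, in diopters, of light that appears to
    originate at the given distance (negative: diverging light)."""
    return -1.0 / apparent_distance_m

def is_substantially_collimated(apparent_distance_m: float) -> bool:
    """The criterion stated above: vergence greater than -1.0 diopter."""
    return vergence_diopters(apparent_distance_m) > -1.0

print(vergence_diopters(25.0))            # -0.04 D, the 25 m example above
print(vergence_diopters(0.03))            # ~-33 D, a bare pixel ~3 cm away
print(is_substantially_collimated(25.0))  # True
```

The -33 diopter figure for an uncollimated display corresponds to a screen held roughly 3 cm from the eye, which is the close-proximity regime the pixel lenses are designed for.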
SUMMARY According to a first aspect, a head-mounted display apparatus is disclosed which includes: (I) a frame adapted to be mounted on a user's head; (II) an image display system having a light emitting surface comprising a series of light emitting pixels, the image display system being supported by the frame (for example, the frame supports the image display system in a fixed location which, during use of the HMD, is outside the user's field of view); and (III) a reflective optical surface supported by the frame (for example, the reflective optical surface can be a continuous surface that is not rotationally symmetric (not a surface of revolution) about any coordinate axis of a three-dimensional Cartesian coordinate system, for example, the reflective optical surface can be a free-space, ultra-wide-angle reflective optical surface that is not rotationally symmetric about the x, y, or z axes of a Cartesian coordinate system having an arbitrary origin); where: (a) the apparatus includes a series of pixel lenses located between the series of light emitting pixels and the reflective optical surface, one pixel lens for each of the light emitting pixels, the pixel lens being aligned with and receiving light from its associated light emitting pixel during use of the apparatus; and (b) the series of pixel lenses, either alone or in combination with the reflective optical surface, collimates or substantially collimates the light emitted from the series of light emitting pixels during use of the apparatus.
According to a second aspect, a head-mounted display apparatus is disclosed which includes: (I) a frame adapted to be mounted on a user's head; (II) an image display system having a light emitting surface comprising a series of light emitting pixels, the image display system being supported by the frame; and (III) a free-space, ultra-wide-angle reflective optical surface supported by the frame;
where: (a) the apparatus includes a series of pixel lenses located between the series of light emitting pixels and the free-space, ultra-wide-angle reflective optical surface, one pixel lens for each of the light emitting pixels, the pixel lens being aligned with and receiving light from its associated light emitting pixel during use of the apparatus; and (b) during use of the apparatus, the free-space, ultra-wide-angle reflective optical surface and the series of pixel lenses produce spatially separated virtual images of spatially separated portions of the light emitting surface, at least one of the spatially separated virtual images being angularly separated from at least one other of the spatially separated virtual images by at least 100 degrees (in some embodiments, at least 150 degrees and, in other embodiments, at least 200 degrees), the angular separation being measured from a center of rotation of a nominal user's eye.
According to a third aspect, a head-mounted display apparatus is disclosed which includes: (I) a frame adapted to be mounted on a user's head; and (II) a monitor assembly supported by the frame, the monitor assembly including: (a) an image display system having a light emitting surface comprising a series of light emitting pixels; and (b) a series of pixel lenses, one pixel lens for each of the light emitting pixels, the pixel lens being aligned with and receiving light from its associated light emitting pixel during use of the apparatus; where, during use of the apparatus, the series of pixel lenses is the only component of the apparatus with optical power between the light emitting surface and a user's eye.
According to a fourth aspect, a head-mounted display apparatus is disclosed which includes:
(I) a frame adapted to be mounted on a user's head; and (II) an image display system supported by the frame; where: (a) the image display system comprises a light emitting surface comprising a series of light emitting pixels; and (b) the apparatus includes a series of spherical pixel lenses, one spherical pixel lens for each of the light emitting pixels, the series of spherical pixel lenses being located between the series of light emitting pixels and a user's eye during use of the apparatus.
In accordance with a fifth aspect, a method is disclosed that includes the steps of: generating an image by an image display system having a light emitting surface comprising a series of light emitting pixels; independently collimating or substantially collimating the light from each respective light emitting pixel of the series of light emitting pixels by a respective pixel lens of a series of pixel lenses aligned with the series of light emitting pixels; providing the collimated or substantially collimated light from the series of pixel lenses to a reflector positioned with respect to a user's eye; and reflecting the collimated or substantially collimated light from the reflector to the user's eye.
According to a sixth aspect, a method is disclosed that includes the steps of: (a) providing light from a series of light emitting pixels; (b) receiving the light produced by the series of light emitting pixels at a series of pixel lenses positioned so that the light from each light emitting pixel is collimated or substantially collimated by a corresponding pixel lens in the series of pixel lenses; and (c) delivering the collimated or substantially collimated light directly (i.e., without passing the light through a field lens or other optical component having optical power) to a user's eye.
In various embodiments, the apparatus and methods are characterized in that: (i) the reflective optical surface (when used) and the series of pixel lenses produce spatially separated virtual images of spatially separated portions of the light emitting surface, at least one of the spatially separated virtual images being angularly separated from at least one other of the spatially separated virtual images by at least 100 degrees (in some embodiments, at least 150 degrees and, in other embodiments, at least 200 degrees), the angular separation being measured from the center of rotation of a nominal user's eye; and (ii) at least one point on the reflective optical surface is angularly separated from at least one other point on the reflective optical surface by at least 100 degrees (in some embodiments, at least 150 degrees and, in other embodiments, at least 200 degrees), the angular separation being measured from the center of rotation of a nominal user's eye.
For these embodiments, during use, at least one of the spatially separated virtual images can be located along a direction of gaze that passes through the at least one point of the reflective optical surface, and at least one other of the spatially separated virtual images can be located along a direction of gaze that passes through the at least one other point of the reflective optical surface.
In various embodiments, a separate series of pixel lenses, a separate image display system, and/or a separate reflective optical surface (when used) is employed for each of the user's eyes.
In other embodiments, the reflective optical surface (when used) contributes to the collimation (or substantial collimation) of the light from the image display system provided by the series of pixel lenses, such contribution to the collimation (or substantial collimation) being achieved through the local radii of curvature of the surface. In various embodiments, the HMD apparatus can be a binocular non-pupil-forming system in which the eye is free to move about its center of rotation through all of its normally obtainable angular extents without being constrained to look through an external pupil. Previous HMD devices have claimed to have or to be able to provide a wide field of view, but these devices have included an external pupil through which the eye must look. Although a large amount of information may be provided to the eye, if the eye turns, the information disappears. This is a fundamental problem with pupil-forming systems that is avoided in the embodiments of this disclosure that employ reflective surfaces and, in particular, FS/UWA/RO surfaces. It should be understood that both the above general description and the following detailed description are merely exemplary of the invention and are intended to provide an overview or framework for understanding the nature and character of the invention. Additional features and advantages of the invention are described in the detailed description below and, in part, will be readily apparent to those skilled in the art from that description or recognized by practicing the invention as exemplified by the description herein. The accompanying drawings are included to provide a further understanding of the invention and are incorporated into and constitute a part of this specification. It should be understood that the various features of the invention disclosed in this specification and the drawings can be used in any and all combinations. Similarly, the various limitations of the claims can be used in any and all combinations.
BRIEF DESCRIPTION OF THE DRAWINGS Figure 1 is a partial block representation of a monitor assembly that includes collimating pixel lenses according to an exemplary embodiment. Figure 2 is a block representation of a monitor assembly that includes a collimating pixel lens for each pixel according to an exemplary embodiment.
Figure 3 is a perspective view of a monitor assembly that includes a collimating pixel lens for each pixel according to an exemplary embodiment.
Figure 4 is a diagram of light rays being collimated by a pixel lens according to an exemplary embodiment.
Figure 5 is a diagram of light rays being collimated by an alternative pixel lens according to an exemplary embodiment.
Figure 6 is a side view of an augmented reality head-mounted display apparatus having pixel lenses according to an exemplary embodiment.
Figure 7 is a front view of a user wearing the augmented reality head-mounted display apparatus of figure 6. Figure 8 is a diagram illustrating light paths for the augmented reality head-mounted display apparatus of figures 6 and 7. Figure 9 is a ray diagram illustrating rays of light for an augmented reality head-mounted display apparatus having pixel lenses according to an exemplary embodiment.
Figure 10 is a ray diagram illustrating rays of light entering an eyeball according to an exemplary embodiment.
Figure 11 is a top view of a head-mounted display apparatus illustrating the use of two curved reflective optical surfaces corresponding to the two eyes of a user according to an exemplary embodiment.
Figure 12 is a schematic diagram illustrating the static field of view of a nominal human eye for a straight-ahead direction of gaze.
Figure 13 is a schematic diagram illustrating the interaction between the static field of view of figure 12 and an FS/UWA/RO surface according to an exemplary embodiment.
The arrows in figure 13 illustrate the directions of light propagation.
Figure 14 is a ray diagram illustrating a light path from a given pixel on a monitor as it is reflected toward an eye according to an exemplary embodiment. Figure 15 is a schematic diagram illustrating the geometry for calculating a local normal of a reflective surface according to an exemplary embodiment. Figure 16 is a ray diagram illustrating light paths from two pixels on a monitor as they are reflected toward an eye according to an exemplary embodiment. Figure 17 is a diagram illustrating the variables used in selecting the direction of the local normal of a reflector according to an exemplary embodiment. Figure 18 is a representation of a curved reflector along with light paths according to an exemplary embodiment. Figure 19 is a side view of an augmented reality head-mounted display apparatus having pixel lenses according to an exemplary embodiment. Figure 20 is a side view of an immersive head-mounted display apparatus having pixel lenses according to an exemplary embodiment. Figure 21 is a ray diagram illustrating ray paths for an immersive head-mounted display apparatus having pixel lenses according to an exemplary embodiment. Figure 22 is a diagram illustrating light paths for an immersive head-mounted display apparatus having pixel lenses according to an exemplary embodiment.
DETAILED DESCRIPTION I. Introduction As discussed above, the present disclosure relates to HMDs that provide a user with a collimated (or substantially collimated) image through the use of pixel lenses. The pixel lenses may be the only source of collimation in the optical system or, in embodiments that employ a curved reflective optical surface, for example, an
FS/UWA/RO surface, the collimation provided by the pixel lenses can be combined with the collimation contributed by the curved reflective optical surface.
Generally speaking, in certain embodiments, the HMD's image display system is placed directly in front of the eye for use as an immersive display apparatus.
In other embodiments, the HMD includes a flat or curved beam splitter to reflect the light from the image display system so as to produce an augmented reality display apparatus.
If desired, the reflective embodiments can be used to produce an immersive display apparatus by making the reflective surface non-transmissive.
The following discussion begins with a description of non-limiting examples of pixel lens embodiments that can be used in the HMDs disclosed herein (Section II) and then proceeds to a discussion of HMDs that employ at least one curved reflective optical surface, including HMDs that employ at least one FS/UWA/RO surface (Section III). Section III also includes a discussion of the design process for an FS/UWA/RO surface that is used in an optical system that includes pixel lenses.
After Section III, embodiments that employ a reflective optical surface that is not curved are discussed (Section IV), followed by embodiments in which an image display system is viewed directly without the use of a reflective surface (Section V). Finally, a general discussion applicable to the various embodiments disclosed in this document is presented (Section VI). It should be understood that the discussions of the various components of the HMDs that appear in particular sections of the presentation are not limited to the embodiments of that section, but are generally applicable to all of the embodiments disclosed in this document.
As an example, the descriptions of the types of pixel lenses and image display systems that can be used in an HMD that appear in Sections II and III are applicable to all of the embodiments disclosed in this document.
II. Pixel Lenses As discussed above and in more detail below, the pixel lenses perform the collimation (or partial collimation) of the light emitted by an image display system in order to allow the system to be viewed when located in close proximity to a user's eye, without introducing substantial volume or weight into the HMD. Having a single lens per pixel reduces the size of the required monitor optics and eliminates the distortion that can result from collimating with just a single large mirror or lens. In particular, the pixel lenses do not cause large field aberrations in the images produced by the monitor because they process only one pixel at a time. In addition, a large series of pixel lenses can be provided to allow a field of view as wide as desired by the user. In particular, in certain embodiments, the pixel lens technology allows the display of a monitor screen, such as a screen similar to a cell phone display, in close proximity, but stretched across a reflective surface. If desired, each pixel lens can vary independently based on the location of its associated pixel with respect to the user's eye. Figure 1 is a partial block diagram of an image display system 10 and its associated series of pixel lenses 15 according to an exemplary embodiment. The image display system 10 comprises a computer output or projection surface 25 and, in this embodiment, an image projection assembly 40. Surface 25 includes a plurality or series of light emitting pixels (for example, pixels 120 of figures 2 and 3). The image display system 10 produces text, graphic or video images (hereinafter referred to as an "image") that can be perceived by the human eye. The plurality or series of light emitting pixels and the image projection assembly 40 can be manufactured using liquid crystal display (LCD) technology, light emitting diode (LED) technology, organic light emitting diode (OLED) technology, gas plasma technology, fiber optic bundle technology
or other image projection technology now known or subsequently developed.
Associated with the image display system 10 is a plurality or series of pixel lenses 15 on or in a substrate 30. Each pixel lens 15 is of a small size, for example, in the 5 micrometer (μm) range, and is configured to be aligned with a single light emitting pixel of the image display system so that the light emanating from the pixel is collimated or partially collimated to facilitate viewing of the images formed by the image display system at close range. Depending on the particulars of the image display system 10, the system will generally include multiple layers, films and/or substrates with filters, emitters, etc., to produce an image displayed on the system's light emitting surface (the system's screen) for viewing by a user, as is well established.
In one embodiment, each light emitting pixel is controlled by pixel control information such as red, green and blue (RGB) data corresponding to the color intensities of a particular pixel.
The image display system 10 can receive RGB input data or other information from a graphics source (for example, camera 540 of figure 6). The RGB data is used to drive row and column driver circuits or other means of controlling the pixels so as to display the image in a form that is observable by the human eye.
In one embodiment, the image display system 10 may include a flat panel monitor screen.
In other embodiments, the image display system may include a curved monitor screen.
In operation, the image display system is built to control the light at the pixel level.
In particular, the intensity of each light emitting pixel can vary independently based on its location with respect to the eye and/or its associated pixel lens.
In this way, the image produced by the display system can have substantially uniform intensity across the field of view during viewing close to the eye.
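As a rough illustration of such per-pixel intensity control, the sketch below boosts the drive level of off-axis pixels to undo an assumed cos⁴ illumination falloff at the eye; the cos⁴ model, the function name and the numbers are illustrative assumptions, not specified by the patent:

```python
import math

def pixel_gain(x_mm: float, y_mm: float, eye_distance_mm: float) -> float:
    """Relative drive gain for a pixel at (x, y) on a flat panel viewed
    from eye_distance_mm away, assuming (hypothetically) a cos^4
    natural-vignetting falloff that the gain must compensate."""
    r = math.hypot(x_mm, y_mm)              # pixel's offset from panel center
    theta = math.atan2(r, eye_distance_mm)  # off-axis angle seen at the eye
    return 1.0 / math.cos(theta) ** 4       # drive oblique pixels harder

# A pixel 20 mm off-center on a panel 30 mm from the eye would need roughly
# 2.1x the drive of the central pixel for uniform apparent brightness.
print(round(pixel_gain(20.0, 0.0, 30.0), 2))
```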
In some embodiments, the use of pixel lenses can simplify the other optics used in the head-mounted display apparatus.
For example, in some embodiments, lens or mirror constructions that might otherwise be used for collimation are no longer needed.
All of the remaining optics could then be concerned only with distributing the available collimated pixel beams across an appropriate field of view so as to provide a desired field of view to the user, to the extent that any other optics is needed at all.
More optics may be needed in the case of augmented reality, but in that case, there may be no need to provide collimation, only the distribution of the light beams.
In all cases, there is little or no chromatic aberration of the kind that can arise from having to pass the light from the monitor through a thick refractive lens that handles all the pixels in one lens, and chromatic aberration does not occur at a reflector.
Figures 2 and 3 show a partial representation of a monitor assembly 100 including a pixel lens 115 for each pixel 120 of an image display system.
By adjusting the collimation of a single pixel at a time, a set of collimated or partially collimated light beams is provided by the monitor assembly 100, which can then be manipulated with various light transmission and/or reflection devices without having to adjust the diopter or collimation properties (or with less adjustment of the diopter or collimation properties than might otherwise be necessary), thus facilitating the construction of a near-to-the-eye display system.
The near-to-the-eye display system can be mounted one, two or three centimeters from the eye, or the image displayed by the system can be reflected from a surface that is one, two or three centimeters from the eye, as will be described in more detail below.
Each pixel lens 115 is constructed to provide an appropriate amount of collimation correction for how close the monitor assembly 100 is intended to be held to the eye.
The conglomeration of all the pixel lenses becomes a series of pixel lenses, one lens per light emitting pixel, combining to form an image when taken together as a series.
The series will generally have many more pixel lenses than shown in figures 1 to 3, such as hundreds of thousands or millions of pixel lenses.
In the embodiment illustrated in figures 2 and 3, the pixels 120 are supported on a substrate 125. The pixel lenses 115 can be supported on or formed on a substrate 130 that allows light to pass through at least the areas that support the pixel lenses 115. Substrate 130 and the corresponding pixel lenses 115 are supported at a fixed distance from substrate 125 and pixels 120 in one embodiment.
In figure 2, spacers 121 are used to obtain the desired spacing between the pixel lenses and the pixels.
Figure 3 shows an alternative embodiment in which substrate 130 is integrally formed with substrate 125 and is thick enough to provide the desired separation between pixels 120 and pixel lenses 115. While shown as flat in figures 1 to 3, the monitor assembly can be curved.
For example, in the case of a direct view HMD, the monitor assembly can be concave towards the user's eye.
As another example, in the case of an HMD that employs a reflective surface, the monitor assembly can be convex, for example, cylindrical, toward the reflective surface so as to spread the individual pixel beams into a wider array and thus provide a broader field of view.
When the light is completely collimated by a pixel lens 115, the photon radiation from a pixel 120 will travel to the eye in a narrow pencil of light of approximately the same diameter as pixel 120. This reduces the internal illumination lost to the viewer, makes the image brighter, and makes it easier to see the light from the image display system or transmitted from the outside world.
Specifically, in an embodiment of a head-mounted display apparatus employing pixel lenses, the center of rotation of the eye is substantially at a fixed location with respect to a particular pixel 120 and, through the use of a pixel lens, pixel 120 can radiate its light in one direction, satisfying the eye's needs whether the eye looks directly at pixel 120 or absorbs the pixel's illumination as part of a wider field of view when the eye is pointed in a different direction.
Looked at in another way, because the light from each pixel 120 is collimated or substantially collimated, the amount of power required to display the same amount of light to the user is reduced compared to that needed in the case of a system that does not use pixel lenses.
In the case of a system that does not use pixel lenses, the pixels generate light that is scattered in many directions and does not penetrate the user's eye and, unless the lost light is absorbed, it becomes internal "light noise" interfering with the optical environment inside the head-mounted display apparatus.
So, in summary, the pixel lenses are configured to focus the light into a set of collimated (or substantially collimated) narrow beams, reducing lost lumens and thus not requiring the extra power needed to generate extraneous unused lumens.
In some embodiments, no fixed relationship with the eyes is necessary, especially where the pixel lenses 115 are configured to completely collimate the light emerging from the light emitting pixels, in which case the pixels and the pixel lenses can be placed anywhere the eye can see them.
Partially collimated systems can be moved away from the eye by an amount that allows the user to see the image, text, video, or other graphic information displayed in focus with or without substantial accommodation by the user's eye.
For example, by having an image at a finite distance of, for example, 30 m, as opposed to at infinity, the eye may be more relaxed (less accommodated) when viewing the images.
In one embodiment, a television-style display system can be provided to the user, who can then view video images as if the images came from a distance, since the wavefronts are collimated, and can, for example, walk through a landscape without having to refocus his/her eyes.
The television display system can be placed anywhere in the user's field of vision.
If, for example, the monitor covers the entire field of view of the user, for example, because it is very close to the user's eye, then the monitor controls what the user can see and objects can be made to appear close, far away or intermediate.
In other embodiments, if the display system is being used in an augmented reality vision system, the display system must be positioned so that its images appear in reality where the augmented reality system is designed to make them appear.
In some embodiments, there is none of the non-paraxial image distortion found in previous devices that process the light from a monitor through a lens construction that accommodates all of the pixels.
Since the pixels are already differentiated into the smallest monitor unit that will be presented to the user, applying a corrective diopter lens (i.e., a pixel lens 115) to that smallest unit alone avoids the non-paraxial aberrations and distortions otherwise associated with a lens.
Also, once the light is collimated, the light path can be easily bent and directed with mirror assemblies since the mirror will no longer have to perform collimation functions as well.
In one embodiment, the diopter prescription for each pixel lens 115 can be set at a custom level.
In other embodiments, the monitor assembly 100 can be constructed in a curved shape, with the output of each pixel lens 115 aimed so as to reflect in a specific direction when it strikes a mirror.
The use of a single pixel lens 115 corresponding exactly to a pixel 120 allows the creation of miniature monitors that can be placed directly in front of the eye and seen clearly.
The pixel lenses 115 work directly with the pixels 120 to correct the diopter of each pixel 120. Figure 4 shows rays 310 having a large vergence emanating from a pixel 315 and entering a pixel lens 320 whose supporting substrate 130 is located at a distance D1 from the pixel, where distance D1 is, for example, approximately 8 μm.
The pixel lens
320 has a generally dome-shaped profile that is solid.
In other words, the exit surface of the pixel lens 320 is curved and the entry surface is essentially flat and integral with the substrate 130. The flat side of the pixel lens 320 has a diameter D2, where D2 is, for example, approximately 8 μm.
In this embodiment, the radius of curvature of the curved portion of lens 320 can be, for example, 5.45 μm.
The output of lens 320 from the curved exit surface is collimated waves 325, suitable for viewing at close range by the human eye.
In other embodiments, the distance D1 from substrate 130 to pixel 315, and the sizes of lenses 320 and pixels 315, can vary, along with the corresponding curvatures of the lenses, to provide the desired collimation of the light.
The dimensions and curvature of the pixel lens can be determined in a number of ways, a convenient approach being to use an optical modeling program, such as ZEMAX, and adjust the parameters until a desired level of collimation is achieved.
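Before committing to a full ZEMAX model, a first-order sanity check is possible by treating the dome as a single refracting surface between the substrate medium and air; in the paraxial limit, collimation requires the pixel to sit at s = nR/(n - 1) inside the medium. The index of ~1.6 below is an assumption chosen for illustration, not a value from the patent:

```python
def source_distance_for_collimation(radius_um: float, n: float) -> float:
    """Paraxial pixel-to-apex distance, inside a medium of index n, at
    which a convex exit surface of the given radius collimates light
    leaving the medium into air: s = n * R / (n - 1)."""
    return n * radius_um / (n - 1.0)

# With the 5.45 um radius quoted for figure 4 and an assumed index of
# ~1.6, collimation wants the pixel about 14.5 um from the dome's apex,
# the same order as the quoted 8 um substrate gap plus the dome height.
print(round(source_distance_for_collimation(5.45, 1.6), 2))
```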
Figure 5 is a diagram of light rays being collimated by an alternative pixel lens 420 according to an exemplary embodiment.
The rays 410 emanating from a pixel 415, having a large vergence at that distance, enter a spherically-shaped pixel lens 420 which is situated at a distance D3 which is, for example, approximately 3.3 μm from the pixel 415. In this case, the diameter D4 of lens 420 can be, for example, approximately 8.7 μm.
The output of lens 420 is collimated waves 425, suitable for viewing at close range.
The monitor assembly in one embodiment is composed of a series of such pixels 415 and spherically-shaped lenses 420.
The pixel lens 420 will normally be essentially solid.
Such a lens may be easier to manufacture in some embodiments, for example, as an integral unit with a substrate.
In one embodiment, lens 420 has a center C in the median plane 401 that is aligned with pixel 415 so that lens 420 is symmetrically placed in the path of the rays 410 emanating from the pixel.
A series of such spherical lenses can be formed, with one lens following the other, each lens having a center C aligned with a respective pixel.
As with the pixel lens of figure 4, the pixel lens of figure 5 can be designed in several ways, a convenient approach being to use an optical software program and vary one or more parameters of the monitor assembly, for example, the location of the spherically-shaped lens with respect to pixel 415, until a desired level of collimation is obtained.
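The quoted geometry can likewise be checked against the standard paraxial ball-lens formula, under which the focal point sits EFL - D/2 outside the sphere, with EFL = nD / (4(n - 1)); the refractive index of ~1.4 used here is an assumption chosen to match the quoted dimensions, not a value from the patent:

```python
def ball_lens_gap_um(diameter_um: float, n: float) -> float:
    """Paraxial gap between a ball lens's surface and its focal point:
    EFL = n*D / (4*(n - 1)), measured from the sphere's center, so the
    gap outside the sphere is EFL - D/2."""
    efl = n * diameter_um / (4.0 * (n - 1.0))
    return efl - diameter_um / 2.0

# With the 8.7 um diameter quoted for figure 5 and an assumed index of
# ~1.4, the focal point falls about 3.3 um outside the sphere, matching
# the quoted pixel-to-lens distance D3: a pixel there emerges collimated.
print(round(ball_lens_gap_um(8.7, 1.4), 2))
```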
Figures 4 and 5 illustrate two exemplary pixel lenses that can be used.
In other embodiments, the parameters of such lenses can vary significantly, and the distances from the pixels adjusted accordingly.
The density of pixels and corresponding lenses can also vary significantly, depending on the desired resolution of the monitor.
The series of pixel lenses can be produced in a variety of ways, such as through the use of various nano- and micro-fabrication techniques.
The lenses can be directly etched into a transparent medium or created with nanomachine polishing.
Micro-replication through hot embossing can be used in some embodiments.
Similar methods include micro-injection molding, micro-thermoforming and nanoimprinting.
Thin film technologies can be used in some embodiments to manufacture the lenses.
For example, the pixel lenses can be made of epitaxially deposited transparent semiconductor material that is controlled and applied using thin film semiconductor technology, among other means.
In other embodiments, injection molding can be used.
Direct optical-chemical etching, as performed for semiconductors, can be employed.
Nanomachine lens polishers can be used to create each lens in the series.
Custom polishing specifications can be applied to groups of pixel lenses and / or individual lenses.
In general terms, pixel lenses can be formed using the same types of manufacturing methods as used for creating display devices, for example, manufacturing methods of the type used to manufacture liquid crystal displays (LCDs), light emitting diodes (LEDs), organic light emitting diodes (OLEDs), or other image projection devices.
Using such techniques, the density of the lenses can be adjusted for high definition monitors or lower resolution monitors as desired. Acrylic plastic (plexiglass) can be used for diamond-turned prototype parts in one embodiment. For molded parts, either acrylic or polycarbonate materials can be used, as an example. In general terms, the small pixel lenses can be made from the same types of materials that are used to produce Fresnel lenses having similarly sized features. As illustrated in the following sections, in various embodiments, the combination of a series of pixel lenses and a series of light emitting pixels can be integrated into a head-mounted display apparatus in the form of, for example, glasses, goggles, or other appropriate configurations that hold the display apparatus in a fixed relationship with one or both eyes of a user.
III. HMDs that employ a curved reflective optical surface As indicated above, HMDs that employ a reflective optical surface and, in particular, a curved reflective optical surface, can be, for example, of the augmented reality type. In these embodiments, the reflective optical surface functions as a beam splitter lens system that reflects an image formed by an image display system into the user's eye, while also allowing light from the outside world to reach the eye. The two images are aligned through the use of appropriate locating equipment and software manipulation of the computer-generated images, allowing the virtual images to appear to the user to be placed in the outside world. In one embodiment, the beam splitter lens system has a controlled mapping of locations on the beam splitter lens system to directions to objects in the external environment. Such mapping is carried through to the pixels and made to be in alignment and registration with the external environment at a high refresh rate. Therefore, movements of a user's head to different orientations with respect to
the external environment will cause images to be generated and displayed which correctly augment the external environment, the image being displayed in the correct apparent location in the environment by illuminating the correct reflective locations on the beam splitter lens system. The surface of the display system and its curvature, together with the diopter shift provided by the pixel lenses, can be manipulated in order to obtain approximately 0 diopters in all directions for the image entering the eye from the monitor.
The amount of diopter change at each pixel lens and at the reflector of the beam splitter lens system can also be adjusted as appropriate to support the design of the head-mounted display.
A flat reflector will introduce no diopter change except with respect to the distance between the display system screen and the eye, which changes the diopter due to the distance alone; that is, the greater the distance from a light source, the less the effective divergence of the light source, and thus distance alone can change the effective vergence of the light emanating from a monitor screen.
Thus, the distances from the reflector to the eye and to the display system can also be adjusted to optimize the clarity of the image that is displayed by the head-mounted display apparatus.
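As a deliberately simplified illustration of this diopter stack-up (treating the contributions as thin elements that add linearly, which ignores propagation between them), the sketch below checks that the corrections sum to approximately 0 diopters at the eye; the particular split between pixel lens and reflector is hypothetical:

```python
def residual_vergence_d(source_d: float, pixel_lens_d: float,
                        reflector_d: float) -> float:
    """Vergence reaching the eye, in diopters: the diverging source
    vergence plus the corrections added by the pixel lens and by the
    curvature of the reflector (thin-element approximation)."""
    return source_d + pixel_lens_d + reflector_d

# Hypothetical budget: a pixel ~3 cm away diverges at about -33 D; if
# its pixel lens supplies +30 D, a curved reflector contributing +3 D
# brings the image to ~0 D (collimated) at the eye.
print(residual_vergence_d(-33.0, 30.0, 3.0))   # ~0.0
```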
Referring now to figures 6 and 7, these figures show, respectively, a side view and a front view of a head-mounted display apparatus 500 shown in use by a user 505. The head-mounted display apparatus employs a curved reflective optical surface, for example, an FS/UWA/RO surface 520. In one embodiment, the head-mounted display apparatus 500 can be, for example, an augmented reality, optical see-through binocular viewer.
Because an augmented reality, optical see-through binocular viewer is typically the most complex form of an HMD, the present disclosure will first discuss embodiments of this type, with the understanding that the principles discussed herein are equally applicable to other augmented reality, optical see-through viewers, both monocular and binocular, and to monocular and binocular "virtual reality" systems.
As shown in figures 6 and 7, the head-mounted display device 500 includes a frame 510 adapted for use by the user and supported by the user's nose and ears in a similar manner to that in which glasses are worn.
In the embodiment of figures 6-7, as well as in the other embodiments disclosed in this document, the head-mounted display apparatus can have a variety of configurations and can, for example, resemble conventional glasses, goggles, helmets, and the like.
In figure 6, elements 550 and 555 represent various forms of support that in some embodiments may be used to retain the HMD frame in a desired position with respect to the user's eyes.
The supports can, for example, be bands or straps that can be adjustable in some embodiments.
In general terms, the outer surface of the HMD package can take any shape that retains the HMD optical system in the required orientation with respect to the user's eyes.
As shown in figures 6 and 7, the head-mounted display apparatus 500 includes (a) at least one monitor assembly 515, which includes an image display system and a series of pixel lenses, and (b) in one embodiment, a free-space, ultra-wide-angle reflective optical surface 520, that is, an FS/UWA/RO surface 520, which of necessity is curved.
The surface 520 can be purely reflective or can have both reflective and transmissive properties, in which case it can be considered a type of "beam splitter". Surface 520 is referred to in this document as a "free space" surface because its local spatial positions, local surface curvatures and local surface orientations are not tied to a particular substrate, such as the x-y plane, but rather, during the design of the surface, are determined using fundamental optical principles (for example, the Fermat and Hero least-time principle) applied in three-dimensional space.
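For each point on such a surface, the least-time principle reduces to the familiar reflection condition: the local normal must bisect the direction toward the pixel and the direction toward the eye. A minimal sketch of that computation (the geometry behind figure 15, discussed above in the drawing descriptions), with arbitrary illustrative coordinates:

```python
import numpy as np

def reflector_normal(pixel_pos, surface_pt, eye_pos):
    """Unit normal a local reflector patch needs so that light from
    pixel_pos striking the surface at surface_pt reflects toward
    eye_pos: the bisector of the two unit directions, which enforces
    angle of incidence = angle of reflection (Fermat/Hero)."""
    to_pixel = pixel_pos - surface_pt
    to_eye = eye_pos - surface_pt
    n = to_pixel / np.linalg.norm(to_pixel) + to_eye / np.linalg.norm(to_eye)
    return n / np.linalg.norm(n)

# Arbitrary coordinates in millimeters: a pixel above and behind the
# surface point, the eye's center of rotation at the origin.
pixel = np.array([0.0, 20.0, 10.0])
surface_point = np.array([0.0, 0.0, 30.0])
eye = np.array([0.0, 0.0, 0.0])
print(reflector_normal(pixel, surface_point, eye))
```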
The surface 520 is referred to as an "ultra-wide-angle" surface because, during use, at a minimum, it does not limit the dynamic foveal field of view of a nominal user's eye.
As such, depending on the optical properties of the monitor assembly with which the FS/UWA/RO surface is used, the overall optics of the HMD can be non-pupil-forming, that is, unlike conventional optical systems that have an exit pupil which limits the user's field of view, the operative pupil for the various optical systems disclosed in this document will be the entrance pupil of the user's eye, as opposed to a pupil associated with the external optical system.
Concomitantly, for these embodiments, the field of view provided to the user will be much larger than with conventional optical systems, where even a small misalignment of the user's eye with the exit pupil of the external optical system can substantially reduce the information content available to the user, and a large misalignment can cause the whole image to disappear.
The FS/UWA/RO surface 520 can completely surround one or both eyes, as well as the monitor assembly 515. In particular, the surface can curve around the sides of the eyes and toward the sides of the face in order to expand the available horizontal field of view.
In one embodiment, the FS/UWA/RO surface 520 can extend up to 180° or more (for example, more than 200°). As illustrated in figure 7, the HMD can include two FS/UWA/RO surfaces 520R and 520L, one for each of the user's two eyes, which are separately supported by the frame and/or a nasal ridge piece 710 (discussed below). Alternatively, the HMD can employ a single FS/UWA/RO surface that serves both eyes with a single structure, some portions of which are viewed by both eyes and other portions of which are viewed by only one eye.
As indicated immediately above and as shown in figure 7, the head-mounted display apparatus 500 may include a nasal ridge piece 710.
The nasal ridge piece can be a vertical bar or wall that provides a separation between the two FS/UWA/RO surfaces, one for each of the user's eyes.
The nasal ridge piece 710 can also provide a separation between the fields of view of the user's two eyes.
In this way, the user's right eye can be shown a first representation of the three-dimensional physical reality of the environment, by displaying a first image to the right eye via a first monitor assembly 515R and a first FS/UWA/RO surface 520R, while the user's left eye is shown a second three-dimensional representation of the physical reality of the environment, by displaying a second image to the left eye via a second monitor assembly 515L and a second FS/UWA/RO surface 520L.
A separate monitor assembly/reflective surface combination thus serves each eye of the user, with each eye seeing the correct image for its location with respect to the three-dimensional physical reality of the environment.
By separating the user's two eyes, the nasal ridge piece 710 allows the image applied to each eye to be optimized independently of the other eye.
In some embodiments, the vertical wall of the nasal ridge piece may include two reflectors, one on each side, to allow the user to see the image as he/she rotates his/her eyes nasally, either to the left or to the right.
Although illustrated in the context of a curved beam splitter, a nasal ridge piece can also be used with embodiments that employ non-curved (flat) beam splitters. The at least one monitor assembly 515 can be mounted inside the FS/UWA/RO surface 520 and can be arranged horizontally or at a slight angle with respect to the horizontal.
Alternatively, the at least one monitor assembly can be located off of the FS/UWA/RO surface.
The inclination or angle of the monitor assembly 515 or, more particularly, of its at least one light emitting surface, will generally be a function of the location of the pixels, images and/or pieces of information of the monitor that are to be reflected from the surface 520. Regardless of the angle at which the monitor assembly is mounted, the light from the pixels needs to be pointed toward the mirror, since the light is in tight beams and the off-axis power will be low.
In certain embodiments, the head-mounted display apparatus 500 is configured to create an internal cavity, with the surface
FS/UWA/RO being reflective into the cavity.
For an FS/UWA/RO surface that has transmissive properties, the monitor image or information from the at least one monitor assembly is reflected off the surface into the cavity and into the user's eye, while, simultaneously, light from the outside world also penetrates the cavity and the user's eye through the reflective surface.
The head-mounted display apparatus may include an electronics package 525 to control the images that are displayed by the at least one monitor assembly 515. In one embodiment, the electronics package 525 includes accelerometers and gyroscopes that provide the location, orientation and position information required to synchronize the images from the at least one monitor assembly 515 with user activities.
Power and video to and from the head-mounted display apparatus 500 can be provided through a transmission cable 530 coupled to the electronics package 525. Video and other information can also be provided via a wireless medium, with the electronics package 525 providing a transceiver.
A set of cameras 540 can be located on opposite sides of the head-mounted display apparatus 500 to provide input to the electronics package, for example, to software or firmware within the electronics package, to help control the computer generation of, for example, "augmented reality" scenes. The set of cameras 540 can be coupled to the electronics package 525 to receive power and control signals and to provide video input to the electronics package's software.
In operation, the electronics package 525, including the accelerometers and/or gyroscopes and, optionally, a global positioning system (GPS) module, can provide the location, orientation, and position information needed to synchronize the images on the at least one monitor assembly 515 with user activities.
This information is used by the electronics package 525 to register where the frame 510 of the apparatus is in physical reality and to superimpose its images on the external view.
In some embodiments, feedback from the cameras 540 can be used by the electronics package 525 to synchronize the displayed images with the reality being viewed.
This can be accomplished by aligning terrain, or explicitly positioned targets, as they actually occur in the views provided by the cameras 540, with stored terrain positions and with known images displayed with respect to the stored terrain.
Once the terrain is aligned, images can be placed on the display screen so that they are included in the field of view and appear on the actual terrain as intended.

As indicated above, the image display system used in the head-mounted display apparatus can take many forms, now known or subsequently developed.
For example, the system may employ small, high-resolution liquid crystal displays (LCDs), light-emitting diode (LED) monitors, and organic light-emitting diode (OLED) monitors, including flexible OLED screens.
In particular, the image display system may employ a small-form-factor, high-definition display device with high pixel density, examples of which can be found in the cell phone industry.
A fiber optic bundle can also be used in the image display system.
In many ways, the image display system can be considered to function as a small screen television.
If the image display system produces polarized light (that is, if the image display system employs a liquid crystal display in which all colors are linearly polarized in the same direction), and if the FS/UWA/RO surface is polarized orthogonally to the light emitted by the display, then the light will not leak out of the FS/UWA/RO surface.
The information displayed and the light source itself, consequently, will not be visible outside the HMD.
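The polarization behavior described above follows Malus's law, I/I0 = cos²θ. The following minimal sketch, offered only as an illustration and not as part of the original disclosure, checks the orthogonal case in which no display light leaks out:

```python
import math

def transmitted_fraction(angle_between_polarizations_deg):
    # Malus's law: I/I0 = cos^2(theta) for linearly polarized light
    # passing through an analyzer oriented at angle theta.
    return math.cos(math.radians(angle_between_polarizations_deg)) ** 2

print(transmitted_fraction(90.0))  # orthogonal orientation -> ~0 (no leakage)
print(transmitted_fraction(0.0))   # parallel orientation -> 1 (full transmission)
```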
The overall operation of an exemplary embodiment of an optical system constructed in accordance with the present disclosure, specifically an optical system for an "augmented reality" HMD, is illustrated by the ray traces of figure 6, specifically light rays 560, 565, and 570. In this embodiment, the FS/UWA/RO surface 520 has both reflective and transmissive properties. Using the transmissive properties of surface 520, light ray 570 enters from the environment through the surface and proceeds toward the user's eye. From the same region of surface 520, light ray 560 is reflected by the surface (using the reflective properties of the surface) and joins light ray 570 to create the combined light ray 565 that enters the user's eye when the user looks in the direction of point 580, that is, when the user's gaze direction is toward point 580. While so looking, the user's peripheral vision capabilities allow the user to see light from other points in the environment that pass through surface 520, again using the transmissive properties of the surface. In particular, the user's peripheral vision capabilities allow the user to see light farther laterally and vertically from just around point 580, through the surface of the beam splitter lens system 520.

As seen in figure 8, where the reference numbers are the same for the elements of figures 5, 6, and 7, the vision system consists of three parts: the monitor assembly 515, the beam splitter lens system 520, and the eye 810. With respect to the monitor assembly, the beam splitter lens system 520 functions as a reflector. The eye 810 has an internal lens 820. Light ray 560 is emitted from a pixel of the at least one display system of assembly 515 and is collimated (or partially collimated) by the associated pixel lens of the assembly. Light ray 560 will arrive at a point on the retina of the eye after being reflected by the beam splitter lens system 520 and is illustrated as light rays 565 and 830. The term "light ray" in this document means a normal to the wavefront of the light emanating from the source, taking the shortest optical path from the emitter to the detector, in this case the retina. What the eye sees, however, is a virtual image that appears in the space in front of it, at a distance represented by vectors 840 and 850. For a virtual image at infinity 860, the ray distance is the sum of vectors 840 and 850. The mirror/beam splitter of the beam splitter lens system 520 is shown as curved in this representation, but it can be flat.
If the mirror is curved, then the dioptric correction provided by the mirror must be subtracted from the dioptric correction provided by the pixel lenses in order to allow the eye to focus on the image.

The ray trace of figure 9 illustrates a wavefront of light that emanates from a pixel of a monitor assembly 905, is collimated before leaving the assembly, and emerges as light 910 of 0 (zero) diopters directed toward a reflective mirror or beam splitter 915. There is no divergence, or very little, at this point, and likewise in reflected beam 920. The beam of light could also pass directly into an eye 930 and be focused at that point.
In the embodiment shown in this document, the reflected beam 920 is reflected off the reflective mirror or beam splitter 915 and travels toward the pupil of the eye.
This is equivalent to the light coming from an essentially infinitely distant point represented by line 925, and the light wavefront is flat, resulting in parallel wavefront surface normals, shown as reflected beam 920, passing through the entrance pupil into the eye 930. Figure 10 illustrates the parallel reflected beam 920 entering the eye 930 through the pupil 1005 and being focused at point 1010 on the fovea of retina 1015, where the highest visual acuity occurs.
The surrounding retina responds to the broader field of view, but with lower acuity.
Light from the surrounding external environment can also enter the eye from this direction, joining with the magnified image provided by the collimated light emitted from the pixel.
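The diopter bookkeeping described in the preceding paragraphs can be checked with first-order (Gaussian) optics. The following sketch is illustrative only; the function names and the numerical values (pixel-to-lens gap, lens-to-mirror distance, mirror radius) are assumptions rather than values from the disclosure. It propagates reduced vergence from a pixel through a thin pixel lens, a free-space gap, and a curved mirror (treated as a folded thin element of power 2/R), and solves for the pixel-lens power that delivers zero-diopter (collimated) light to the eye:

```python
def free_space(vergence_d, distance_m):
    # Reduced-vergence propagation over a gap: V' = V / (1 - d * V).
    return vergence_d / (1.0 - distance_m * vergence_d)

def vergence_at_eye(pixel_to_lens_m, lens_power_d, lens_to_mirror_m, mirror_radius_m):
    v = -1.0 / pixel_to_lens_m           # diverging light leaving the pixel
    v += lens_power_d                    # thin pixel lens adds its power
    v = free_space(v, lens_to_mirror_m)  # gap between pixel lens and mirror
    v += 2.0 / mirror_radius_m           # concave mirror contributes ~2/R diopters
    return v                             # ~0 diopters means collimated light

def lens_power_for_collimation(pixel_to_lens_m, lens_to_mirror_m, mirror_radius_m):
    # Solve backwards for the pixel-lens power that yields 0 D at the eye;
    # the mirror's dioptric contribution is, in effect, subtracted from what
    # the pixel lens would otherwise need to supply.
    v_mirror_in = -2.0 / mirror_radius_m                       # cancels the mirror
    v_lens_out = v_mirror_in / (1.0 + lens_to_mirror_m * v_mirror_in)
    return v_lens_out + 1.0 / pixel_to_lens_m

# Illustrative (assumed) numbers: pixel 4 mm behind its lens, mirror 40 mm
# away with a 150 mm radius of curvature.
p = lens_power_for_collimation(0.004, 0.040, 0.150)
print(p, vergence_at_eye(0.004, p, 0.040, 0.150))  # second value ~0.0 diopters
```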
As discussed above, the previous optical systems used in HMDs that employed reflective optical surfaces were pupil-forming and thus had limited viewing areas, a typical field of view being ~ 60 degrees or less.
This greatly limited the value and capacity of the previous head mounted displays.
In various embodiments, the head-mounted displays disclosed in this document have much wider fields of view (FOV), thus allowing much more optical information to be provided to the user compared with HMDs having smaller fields of view. The wide field of view can be greater than 100°, greater than 150°, or greater than 200°. In addition to providing more information, the wide field of view allows the additional information to be processed by the user in a more natural manner, enabling better immersive and augmented reality experiences through closer matching of the displayed images to physical reality.

Specifically, in the exemplary embodiment illustrated in figure 11, for a straight-ahead gaze direction, the eye is able to take in a total viewing area represented in figure 11 by the curved FS/UWA/RO surfaces 201 and 202, corresponding to at least 150 degrees of horizontal field of view (FOV) for each eye (for example, ~168 degrees of horizontal FOV). This field of view is composed of the eye's foveal field of view and its peripheral field of view. In addition, the eye is allowed to move freely about its center of rotation to aim the combined foveal + peripheral field of view in different directions, as the eye naturally does when viewing the physical world. The optical systems disclosed in this document thus allow the eye to obtain information over a range of motion in the same way the eye does when viewing the natural world.

Looking at figure 11 in more detail, this figure is a simplified line representation of the front of a user's head 200 as seen from the top. It shows the FS/UWA/RO surfaces 201 and 202 placed in front of the user's eyes 203 and 204. As discussed above, the FS/UWA/RO surfaces 201 and 202 can rest on the user's nose 205, where they come together at the front at the center 214 of the user's head 200. As discussed in detail below, the local normals and local spatial locations of surfaces 201 and 202 are adjusted so that the images produced by the at least one monitor assembly (not shown in figure 11) cover at least 100°, for example, in certain embodiments at least 150° and, in other embodiments, at least 200°, of horizontal FOV for each eye, where, as discussed above, fields of view of 100°, 150°, and 200° generally correspond to a nominal user's foveal dynamic field of view, foveal + peripheral static field of view, and foveal + peripheral dynamic field of view, respectively.
As also discussed below, the local radii of curvature are likewise adjusted to provide distant virtual images when the optical effects of the curved reflective surface are combined with the optical effects of the pixel lenses.
For example, the local normals and local spatial locations can be adjusted to cover the user's complete, straight-ahead, horizontal static field of view of ~168 degrees for each eye, with the 168 degrees extending from edge to edge of the FS/UWA/RO surfaces 201 and 202, as shown by lines 210, 211 and 212, 213. The lines of sight thus correspond to the wide static field of view (foveal + peripheral) that is provided to the user.
In addition, the user is free to move his eyes about the centers of rotation 215 and 216 while still seeing the computer-generated image.
In figure 11, as well as in figure 18, the FS/UWA/RO surfaces are shown as portions of spheres for ease of presentation.
In practice, the surfaces are not spheres, but have more complex configurations so that their local normals, local spatial locations, and local radii of curvature will provide the desired static and dynamic fields of view and the desired distances to the virtual images.
Also, in figure 11, the right side of the head mounted display device operates in the same way as the left side, it being understood that the two sides may differ if desired for particular applications.
Also, for ease of presentation, figures 11 to 18 do not show an optical system that includes pixel lenses between the at least one image display system and the reflective optical surface, it being understood that, in accordance with the present disclosure, such lenses are used in the embodiments disclosed in this document.
Figures 12 and 13 also illustrate the static and dynamic fields of view provided by the FS/UWA/RO surfaces disclosed in this document.
Figure 12 shows a user's nominal right eye 71 having a straight-ahead gaze direction 73. The foveal + peripheral field of view of the eye is shown by arc 75, which has an angular extent of ~168°. Note that, for ease of presentation, in figures 12-13 the field of view is shown relative to the center of rotation of the user's eye, as opposed to the center or edges of the user's pupil.
In fact, the large field of view (e.g., ~168°) obtained by a human eye is a result of the large angular extent of the retina, which allows highly oblique rays to enter the user's pupil and reach the retina.
Figure 13 schematically shows the interaction of the field of view of figure 12 with an HMD having: (a) an image display system whose at least one light-emitting surface 81 has a first light-emitting region 82 (illustrated as a square) and a second light-emitting region 83 (illustrated as a triangle) and (b) an FS/UWA/RO surface having a first reflective region 84 that has a first local normal 85 and a second reflective region 86 that has a second local normal 87. As indicated above, the FS/UWA/RO surface is both a "free space" surface and an "ultra-wide-angle" surface. In addition, as indicated above and discussed in more detail below, the surface can participate in the collimation (or partial collimation) of the light entering the user's eye.
Such collimation makes the virtual image produced by the FS/UWA/RO surface and the pixel lenses appear to be located, for example, at a long distance from the user, for example, 30 m or more, which allows the user to easily focus on the virtual image with a relaxed eye.
The "free space" and "ultra-wide angle" aspects of the FS/UWA/RO surface can be achieved by adjusting the local surface normals so that the user's eye sees the light-emitting regions of the at least one image display system as coming from predetermined regions of the FS/UWA/RO surface (predetermined locations on the surface). For example, in figure 13, the HMD designer may decide that it would be advantageous for a virtual image 88 of the square to be viewed by the central portion of the user's retina when the user's gaze direction is straight ahead, and for a virtual image 89 of the triangle to be viewed by the central portion of the user's retina when the gaze direction is, for example, ~50° to the left of straight ahead.
The designer can then configure the at least one image display system, the reflective optical surface, the pixel lenses, and any other optical components of the system so that the virtual image of the square is straight ahead and the virtual image of the triangle is 50° to the left of straight ahead during use of the HMD.
In this way, when the user's gaze direction (line of sight) intersects the FS/UWA/RO surface straight ahead, the virtual image of the square will be visible at the center of the user's eye, as desired, and when the user's gaze direction (line of sight) intersects the FS/UWA/RO surface at 50 degrees to the left of straight ahead, the virtual image of the triangle will be visible at the center of the user's eye, as desired.
Although not illustrated in figures 12 and 13, the same approaches are used for the vertical field of view, as well as for off-axis fields of view.
More generally, in the design of the HMD and each of its optical components, the designer "maps" at least one light-emitting surface of the monitor to the reflective surface so that the desired portions of the monitor are visible to the user’s eye when the user’s gaze is in particular directions.
Thus, as the eye sweeps across the field of view, both horizontally and vertically, the FS/UWA/RO surface reflects different portions of the at least one light-emitting surface of the image display system into the user's eye.
Although the above discussion was in terms of the center of a nominal user's retina, the design process can, of course, use the location of a nominal user's fovea instead, if desired.
It should be noted that in figure 13, any rotation of the user's eye to the right makes the virtual image 89 of the triangle no longer visible to the user.
Thus, in figure 13, any gaze direction that is straight ahead or to the left of straight ahead provides the user with virtual images of both the square and the triangle, while a gaze direction to the right of straight ahead provides a virtual image of the square only.
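The gaze-dependent visibility just described reduces to simple angular bookkeeping. The following sketch is illustrative only; it assumes a nominal 168-degree foveal + peripheral field centered on the gaze direction, with the square placed straight ahead (0°) and the triangle 50° to the left (-50°), which are assumed placement angles rather than values fixed by the disclosure:

```python
def visible(image_azimuth_deg, gaze_azimuth_deg, fov_deg=168.0):
    # A virtual image is visible when it falls within half the field of
    # view on either side of the current gaze direction.
    return abs(image_azimuth_deg - gaze_azimuth_deg) <= fov_deg / 2.0

# Square at 0 deg, triangle at -50 deg; check three gaze directions:
for gaze in (0.0, -50.0, 40.0):
    print(gaze, "square:", visible(0.0, gaze), "triangle:", visible(-50.0, gaze))
```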
The acuity with which the virtual images are seen will, of course, depend on whether the virtual images are perceived by the user's foveal vision or by the user's peripheral vision.
If the HMD designer had placed the virtual image of the square farther to the right in figure 13 while leaving the virtual image of the triangle farther to the left, there could be gaze directions in which only the virtual image of the square is visible and other gaze directions in which only the virtual image of the triangle is visible.
Likewise, based on the principles disclosed in this document, the designer can arrange the virtual image of the square and the virtual image of the triangle so that the virtual image of the triangle is always visible, with the virtual image of the square being visible for some gaze directions but not for others.
As another variation, the HMD designer can place the virtual images of the square and the triangle in locations where, for one or more gaze directions, neither image is visible to the user; for example, the designer can place the virtual images just outside the user's static field of view for a straight-ahead gaze direction.
The flexibility provided to the HMD designer by the present disclosure is thus readily apparent.
In one embodiment, the "free space" and "ultra-wide angle" aspects of the reflective surface are obtained using the Fermat and Hero principles, according to which light travels along the shortest (minimum-time) optical path. Commonly assigned and copending Patent Application No. 13/211,389, filed on August 17, 2011 in the names of G. Harrison, D. Smith, and G. Wiese, entitled "Methods and Systems for Creating Free Space Reflective Optical Surfaces," and identified by agent document number IS-00354, whose contents are incorporated into this document by reference, describes an embodiment in which the Fermat and Hero principles are used to design FS/UWA/RO surfaces suitable for use in HMDs. See also Patent Application No. 13/211,372, filed on August 17, 2011 in the names of G. Harrison, D. Smith, and G. Wiese, entitled "Head Mounted Display Device Using One or More Reflective Optical Surfaces," and identified by agent document number IS-00267, the contents of which are also incorporated into this document by reference.
Through the Fermat and Hero minimum-time principles, any "desired portion" of the at least one light-emitting surface of an image display system (for example, any pixel of an image display system) can be caused to have any desired reflection point on the FS/UWA/RO surface, with the proviso that the optical path from the desired portion of the at least one light-emitting surface to the reflection point on the FS/UWA/RO surface and then to the center of rotation of the user's eye is at an extremum.
An extremum of the optical path means that the first derivative of the optical path length has reached a value of zero, signifying a maximum or a minimum in the optical path length.
An extremum can be inserted at any point in the field of view by creating a local region of the reflective optical surface whose normal bisects (a) a vector from the local region to the user's eye (for example, a vector from the center of the local region to the center of the user's eye) and (b) a vector from the local region to the "desired portion" of the light-emitting surface (for example, a vector from the center of the local region to the center of the "desired portion" of the light-emitting surface). Figures 14 to 16 illustrate the process for the case where the "desired portion" of the at least one light-emitting surface of the image display system is a pixel.
Specifically, figure 14 shows a light-emitting surface 1510 of an image display system composed of a generally rectangular array of pixels that emanate light toward the front of a head-mounted display apparatus, as illustrated by light ray 1515. Light ray 1515 bounces off the reflective optical surface 1520, which for ease of presentation is shown as a plane in figure 14. On reflection, light ray 1515 becomes light ray 1525, which enters the user's eye 1530. For the purposes of determining the surface normal of the reflector for each pixel, it is only necessary to determine the three-dimensional bisector of the vectors corresponding to light rays 1515 and 1525. In figure 14, this bisector vector is shown in two-dimensional form as line 1535.

The bisector vector 1535 is normal to the reflective optical surface at reflection point 1540, which is the location on surface 1520 where pixel 1545 of light-emitting surface 1510 will be visible to the HMD user. Specifically, in operation, pixel 1545 on display surface 1510 emits light ray 1515, which bounces off reflective optical surface 1520 at the angle established by the surface normal corresponding to bisector vector 1535 and its perpendicular plane 1550, producing, in accordance with the Fermat and Hero principles, a reflection of the pixel at reflection point 1540 that is seen by eye 1530 along ray 1525. In order to precisely calculate the surface normal at reflection point 1540, ray 1525 can be made to pass approximately through the center 1555 of the user's eye 1530. The results will remain approximately stable even if the user's eye rotates, the view becoming peripheral until, as discussed above in connection with figures 12 and 13, the eye rotates so far that the display region cannot be seen with either the user's foveal or peripheral vision.

To calculate the direction of the surface normal, quaternion methods can be used, where

q1 = orientation of ray 1515
q2 = orientation of ray 1525
q3 = orientation of the desired surface normal 1535 = (q1 + q2) / 2

The surface normal can also be described in vector notation, as illustrated in figure 17. In the following equation and in figure 17, point N is one unit distant from point M at the center of the region of interest of the reflective optical surface and lies in the normal direction, perpendicular to the tangent plane of the reflective optical surface at point M.
The tangent plane of the reflective optical surface at point M is controlled to satisfy the relationship expressed in the following equation so that, in three-dimensional space, the surface normal at point M bisects the line from point M to point P at the center of the pixel of interest and the line from point M to point C at the center of rotation of the user's eye (for reference, point C is approximately 13 mm behind the front of the eye). The equation describing point N along the surface normal at point M is:

N = [(P − M) + (C − M)] / |(P − M) + (C − M)| + M

where all of the points N, M, P, and C have components [x, y, z] indicating their positions in three-dimensional space in an arbitrary Cartesian coordinate system.
The resulting normal vector N − M has Euclidean length |N − M| = 1, where the two vertical bars represent the Euclidean length, calculated as follows:

|N − M| = sqrt((xN − xM)² + (yN − yM)² + (zN − zM)²)
As a numerical example, consider the following values of M, P, and C:

M = [xM, yM, zM] = [4, 8, 10]
P = [2, 10, 5]
C = [6, 10, 5]

The point along the normal, N, is calculated as follows:

P − M = [(2 − 4), (10 − 8), (5 − 10)] = [−2, 2, −5]
C − M = [(6 − 4), (10 − 8), (5 − 10)] = [2, 2, −5]
(P − M) + (C − M) = [0, 4, −10]

and

N = [(P − M) + (C − M)] / |(P − M) + (C − M)| + M
  = {[−2, 2, −5] + [2, 2, −5]} / 10.7703296143 + [4, 8, 10]
  = [0, 0.3713806764, −0.928476691] + [4, 8, 10]
  = [4, 8.3713806764, 9.0715233091]

The geometry is shown in figure 15, where the bisector lies between the two longer vectors.
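The calculation above can be verified with a few lines of vector arithmetic. The following sketch simply reproduces the numerical example (it is an illustration, not part of the disclosure); note that it implements the equation exactly as given, which bisects the two vectors when, as here, P − M and C − M have equal lengths:

```python
import numpy as np

def normal_point(m, p, c):
    # Point N, one unit from M along the surface normal that bisects the
    # rays from reflection point M to pixel center P and to eye-rotation
    # center C (valid as a bisector when |P - M| == |C - M|).
    s = (p - m) + (c - m)
    return m + s / np.linalg.norm(s)

M = np.array([4.0, 8.0, 10.0])
P = np.array([2.0, 10.0, 5.0])
C = np.array([6.0, 10.0, 5.0])
print(normal_point(M, P, C))  # -> [4.0, 8.3713806764, 9.0715233091]
```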
The foregoing is, of course, simply a representative calculation that serves to show the use of the Fermat and Hero minimum-time principles in determining the angular constraints on the local normals for a field of points that make up the multiple free-space (free-form) reflective regions intended to present a contiguous virtual image to the viewer.
The only real constants are the center of the user's eye and the natural field of view of the eye.
All other components can be iteratively updated until an appropriate solution for a given image display system and reflective optical surface orientation is achieved.
Seen another way, the reflection locations of the pixel images, M1, M2, ..., Mn, and their associated normals and curvatures can be considered as a matrix that is "deformed" (adjusted) so that the FS/UWA/RO surface produces the desired virtual images of the computer-generated images formed by the image display system.
In applying the Fermat and Hero principles, it should be noted that in some embodiments it will be desirable to avoid the situation in which the normals are adjusted such that the user sees a reflection of the same pixel at more than one point.
It should also be noted that in some embodiments the local regions of the reflective optical surface may be very small and may even correspond to points on the reflector, with the points transitioning into other points to form a smooth surface.
To facilitate the presentation, the effects of the presence of the pixel lenses were not explicitly included in the above discussion of the use of the Fermat and Hero principles to design an FS/UWA/RO surface.
In practice, the presence of the pixel lenses is readily included in the analysis by using, as input to the Fermat and Hero calculations, the propagation directions of the light beams after they have passed through the pixel lenses (and any other optical elements used in the overall optical system). These propagation directions can, for example, be determined using Gaussian optics techniques.
If desired, the Fermat and Hero calculations can be repeated for different initial light vergence settings, as controlled by changing the power of the pixel lenses, until the desired virtual images are obtained.

In order to ensure that the user can easily focus on the virtual image of the "desired portion" of the at least one light-emitting surface (for example, the virtual image of a pixel), in certain embodiments the radius of curvature of the region surrounding the reflection point (the reflection region) is controlled so that, after passage through the pixel lens and reflection from the FS/UWA/RO surface, a collimated (or substantially collimated) image reaches the user.
As noted above, a collimated (or substantially collimated) image has optical rays that are nearly parallel, as if the image had originated at a large distance from the user, tens to hundreds of meters away.
In order to obtain such a surface, depending on the collimating power of the pixel lenses, the radius of curvature of the reflection region of the reflective optical surface corresponding to the "desired portion" of the at least one light-emitting surface (the desired light-emitting pixel) can be kept within a radius on the order of (but greater than) half the distance from the reflection region to the actual "desired portion" of the light-emitting surface (the actual pixel) on the display.
More particularly, the radius will be on the order of half the apparent distance from the reflection region to the "desired portion" of the light-emitting surface when the "desired portion" is viewed through its associated pixel lens from the location of the reflection region.
Thus, in one embodiment, the surface normals for the reflections of the pixel of interest and its adjacent pixels satisfy a relationship that establishes a radius of curvature on the order of approximately half the length of the vector from the pixel's reflection location on the reflective surface to the apparent location of the display pixel as seen through its associated pixel lens.
Adjustments that affect this parameter include the size of at least one light emitting surface and whether the at least one light emitting surface is curved.
Figure 16 illustrates this embodiment.
In order to control the radius of curvature of the region surrounding a pixel reflection so that a collimated (or substantially collimated) image reaches the user, two adjacent pixel reflection regions, such as that at reflection point 1540, are considered.
More regions can be considered for better balance, but two are sufficient.
With reference to figure 16, two pixel reflection points 1540 and 1610 are shown with respect to two pixels, 1545 and 1615, respectively, on the display surface 1510. The surface normals at points 1540 and 1610 are calculated, together with the angle between their two directions.
The radius of curvature is calculated from these angles and the distance between points 1540 and 1610. Specifically, the surface configuration and, if necessary, the spatial location of the surface are adjusted until the radius of curvature is on the order of approximately half the average of the lengths of rays 1515 and 1620, when those lengths are adjusted for the effects of the pixel lenses.
In this way, zero or near zero diopter light can be provided to the user's eye.
As indicated above, this is equivalent to light coming from an essentially infinitely distant point, and the light wavefront is flat, resulting in wavefront surface normals that are parallel to one another.
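The procedure just described can be sketched numerically as follows. This is an illustration only: it assumes unit normals, a small angle between them (so that the chord-over-angle estimate of the radius applies), and ray lengths that have already been adjusted for the pixel-lens effects as described above:

```python
import numpy as np

def local_radius(m1, n1, m2, n2):
    # Two nearby reflection points with unit normals n1, n2: the angle
    # between the normals divided into the chord length approximates R.
    angle = np.arccos(np.clip(np.dot(n1, n2), -1.0, 1.0))
    return np.linalg.norm(np.asarray(m2) - np.asarray(m1)) / angle

def radius_target(ray_len_1515, ray_len_1620):
    # Per the text: R should be on the order of half the average of the
    # (pixel-lens-adjusted) ray lengths to the apparent pixel positions.
    return 0.5 * (ray_len_1515 + ray_len_1620) / 2.0

# Illustrative check (units of mm, values assumed): a 0.5 mm chord with a
# ~0.01 rad tilt between the normals gives a local radius of ~50 mm.
n1 = np.array([0.0, 0.0, 1.0])
n2 = np.array([0.01, 0.0, 0.99995])
n2 = n2 / np.linalg.norm(n2)
print(local_radius([0, 0, 0], n1, [0.5, 0, 0], n2), radius_target(100.0, 102.0))
```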
In addition to controlling the local radii of curvature, in certain embodiments, as a first-order solution for having a collimated (or substantially collimated) image enter the eye, the at least one light-emitting surface is nominally located a focal length away from the FS/UWA/RO surface, where the focal length is based on the average value of the radii of curvature of the various reflective regions making up the FS/UWA/RO surface.
The result of applying the Fermat and Hero principles is a set of reflective regions that can be combined into a smooth reflective surface.
This surface, in general, will not be spherical or symmetrical.
Figure 18 is a two-dimensional representation of such an FS/UWA/RO surface 1520. As discussed above, surface 1520 can be constructed so that the radii of curvature at points 1710 and 1720 are set to values that, when combined with the collimating effects of the pixel lenses, provide a relaxed view of the image from the at least one light-emitting surface of the image display system being reflected by the surface.
In this way, looking in the direction represented by line 1730 will provide a collimated (or substantially collimated) virtual image to the eye 1530, as will looking in a different direction represented by line 1740. To enable a uniform viewing transition across the entire field of view, the FS/UWA/RO surface regions can transition smoothly from one control point to another, as can be done using Non-Uniform Rational B-Spline (NURBS) technology for spline surfaces, thus creating a smooth transition across the reflective surface.
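As an illustration of how control points blend into a smooth surface, the following sketch evaluates one segment of a uniform cubic B-spline, which is the non-rational special case of the NURBS machinery mentioned above (the control values are arbitrary assumptions):

```python
import numpy as np

def cubic_bspline(ctrl, t):
    # One uniform cubic B-spline segment from 4 control points, t in [0, 1].
    # Adjacent segments sharing 3 control points join with C2 continuity,
    # which is the smooth region-to-region transition described above.
    basis = np.array([
        (1 - t) ** 3,
        3 * t ** 3 - 6 * t ** 2 + 4,
        -3 * t ** 3 + 3 * t ** 2 + 3 * t + 1,
        t ** 3,
    ]) / 6.0
    return basis @ np.asarray(ctrl, dtype=float)

# Blend four neighboring control values (e.g., surface heights along one
# parameter direction of the reflector) at the segment midpoint:
print(cubic_bspline([0.0, 1.0, 1.5, 1.2], 0.5))
```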
In some cases, the FS/UWA/RO surface may include a sufficient number of regions that the surface becomes smooth at a fine-grained level.
In some embodiments, different magnifications for each portion of the display (for example, each pixel) can be provided using a gradual gradient, allowing for better manufacturability, design, and image quality.
From the above, it can be seen that the overall head-mounted display can be designed using the following exemplary steps: determining a desired field of view; choosing a display surface size (for example, width and height dimensions); choosing an orientation of the display surface with respect to a reflective surface; choosing a candidate location for the pixel lenses between the display and the reflective surface; choosing a candidate configuration for the pixel lenses; cataloging the position of each pixel on the display surface as seen through its pixel lens; and choosing a location on the reflective surface for the reflection of each pixel of the display surface.
The surface of the monitor and pixel lenses can be placed above the eye and tilted towards the reflective surface, allowing the curvature of the reflective surface to reflect light to the user's eye.
In other embodiments, the display surface and pixel lenses can be placed in other positions, such as to the side of the eye or above the eye, or tilted to a different degree, with the reflector's position and curvature selected to appropriately reflect the light from the display surface to the user's eye.
In certain embodiments, a three-dimensional instantiation or mathematical representation of the reflective surface can be created, with, as discussed above, each region of the reflective surface being a local region having a normal that bisects the vectors from the center of that region to the center of the user's eye and to the center of a pixel on the display surface (that is, to the center of the pixel's apparent position resulting from the presence of the pixel lens associated with the pixel). As also discussed above, the radii of curvature of the regions surrounding a pixel reflection can be controlled so that, in combination with the collimating effects of the pixel lenses, a collimated (or substantially collimated) image reaches the user across the field of view.
Through computer-based iterations, adjustable parameters (for example, the local normals, local curvatures, and local spatial locations of the reflective surface, and the locations, powers, and structures of the pixel lenses) can be adjusted until a combination (set) of parameters is identified that provides a desired level of optical performance across the field of view, as well as a manufacturable design that is aesthetically acceptable.
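The core of such a computer-based iteration can be sketched as follows. This skeleton is illustrative only: it recomputes the local normals from the bisection rule using the pixels' apparent positions as seen through their pixel lenses, and it omits the performance evaluation and the updates to the pixel-lens powers, locations, and structures:

```python
import numpy as np

def bisector_normal(m, apparent_pixel, eye_center):
    # Fermat/Hero condition: the local normal bisects the unit vectors from
    # the reflection point toward the pixel's apparent position (as seen
    # through its pixel lens) and toward the eye's center of rotation.
    u = (apparent_pixel - m) / np.linalg.norm(apparent_pixel - m)
    v = (eye_center - m) / np.linalg.norm(eye_center - m)
    n = u + v
    return n / np.linalg.norm(n)

def design_pass(reflection_points, apparent_pixels, eye_center):
    # One pass of the iterative design loop: recompute every local normal.
    return np.array([bisector_normal(m, p, eye_center)
                     for m, p in zip(reflection_points, apparent_pixels)])

# Illustrative (assumed) geometry in mm: two reflection points, two pixels.
pts = np.array([[0.0, 0.0, 50.0], [5.0, 0.0, 50.0]])
pix = np.array([[0.0, 30.0, 40.0], [5.0, 30.0, 40.0]])
print(design_pass(pts, pix, np.array([0.0, 0.0, 0.0])))
```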
During use, a non-symmetrical FS/UWA/RO surface (which, in certain embodiments, is constructed from a spline surface of multiple local focal regions) in combination with the pixel lenses forms a virtual image of the at least one light-emitting surface of the image display system that is stretched across a wide field of view.
The FS/UWA/RO surface can be thought of as a progressive mirror or progressive curved beam splitter, or as a free-form mirror or reflector.
As the eye scans across the field of view, both horizontally and vertically, the curved FS/UWA/RO surface reflects different portions of the at least one light-emitting surface of the image display system into the user's eye. In various embodiments, the overall optical system can be manufactured in large quantities at low cost while maintaining an image quality commensurate with typical human visual resolution.

IV. HMDs employing a non-curved reflective optical surface

Figure 19 is a side view of a head-mounted display apparatus 600. The head-mounted display apparatus 600 can be a pair of augmented-reality binocular viewers.
The head-mounted display apparatus 600 includes a support member 610 adapted to project out from the user's face when the apparatus is worn by the user 605. The support member 610 is configured to support the at least one monitor assembly 615 above the eyes of the user 605. For example, the at least one monitor assembly 615 can be arranged horizontally or at a slight angle with respect to the horizontal.
The at least one monitor assembly 615 includes one pixel lens for each light-emitting pixel of the assembly.
The head-mounted display apparatus 600 also includes a flat beam splitter lens system 620 oriented at a slight angle with respect to a vertical plane to reflect collimated or substantially collimated light from the at least one monitor assembly 615 to the eyes of the user 605. The head-mounted display apparatus 600 provides near-the-eye viewing with a wide field of view.
The at least one monitor assembly 615 in this embodiment can be larger than in other embodiments to provide a wide field of view, since in this embodiment there is no optical curvature in the beam splitter 620. An electronics package 625 controls the image that is displayed.
In one embodiment, the electronics package 625 can include accelerometers and gyroscopes.
Power and video to and from the head-mounted display apparatus 600 can be provided through a transmission cable 630 or a wireless medium, in which case the electronics package 625 includes a transceiver or wired interface.
A set of cameras 640 can be located on each side of the HMD to provide input to a facility, such as a software or firmware module running in the electronics package 625, to control the computer generation of augmented reality scenes.
Elements 650, 655, 656, and 657 represent various forms of support for holding the apparatus 600 in a desired position with respect to the eye, such as bands or cords, which can be adjustable in some embodiments.
The operation of the system of figure 19 is illustrated by light rays 660, 665, and 670. As shown, light ray 670 enters from the environment through the external surface of the flat beam splitter lens system 620 and is combined with light from the at least one monitor assembly 615 that strikes the inner surface of the flat beam splitter lens system 620 to create the combined light ray 665 that enters the user's eye when the user looks in the direction of point 680. The user's peripheral vision capabilities also allow the user to see light farther laterally and vertically from just around point 680, through the surface of the beam splitter lens system 620. The at least one monitor assembly 615 can be made to bend in a curved, cylindrical manner to allow better access to the pixel information through the eye's optical system and the beam splitter system 620.

V. Direct view HMDs

In addition to the applications above, pixel lenses can also be used for direct viewing of an image display system, without an intervening reflective optical surface.
Such a configuration will be immersive, but it can include information from the outside world through the use of one or more video cameras.
By using pixel lenses, an image from a display can be projected across a wide field of view in a compact space.
Through the use of pixel lenses, the user can see the image that is produced as if it came from a distance, allowing the user's eye to focus on it easily.
Also, a maximum undistorted field of view is obtainable with this approach.
Collimation of the beam is carried out on at least one monitor assembly itself, so no further collimation is required.
The user looks directly at the at least one screen in close proximity, and the at least one image display system can be made as large as necessary to provide the desired field of view.
Pixel lenses allow viewing of the display system when positioned close to the eye.
The optimization can be performed by manipulating the curvature of the display system, pixel size, pixel lens properties, and the distance from the user's eyes to obtain the most useful package.
Figure 20 illustrates a head-mounted display apparatus 1100 being used by a user 1105. The head-mounted display apparatus 1100 can be a pair of immersive binocular viewers 1110.
The viewers 1110 can take a form similar to safety glasses or goggles that supports at least one monitor assembly 1115 with one pixel lens for each light-emitting pixel in the assembly.
The at least one monitor assembly 1115 is positioned directly in the user's field of view and adapted for close viewing by means of the pixel lenses.
The at least one monitor assembly 1115 is mounted on the safety glass or goggle surfaces directly in front of the user's eyes using, for example, bracket 1120, and is oriented essentially vertically so that the pixels emanate light directly toward the user's eyes for an immersive virtual-world experience.
An electronics package 1125 is provided that includes processing circuitry, accelerometers, and gyroscopes supported by the frame so as to control the image being displayed.
Power and video to and from the binocular viewers 1110 can be provided through a transmission cable 1130 or a wireless medium.
A set of cameras 1170 is located on each side of the HMD and supported by the frame to provide input to a software package, for example, a software package that is part of the electronics package 1125, to help control the computer generation of immersive reality scenes.
As seen in figure 21, where the reference numbers are the same for the same elements as in figure 20, the vision system of this embodiment consists of two parts: (a) the at least one monitor assembly 1115 and (b) the eye 810, which has an internal lens 820. The light emitted from a pixel of the monitor assembly 1115 that has passed through the pixel lens associated with that pixel is represented at 565. After passing through the eye's lens 820, this light will arrive at a point on the user's retina.
What the eye sees, however, is a virtual image that appears in the space in front of it, at a distance represented by vectors 840 and 850. For a virtual image at infinity 860, the ray distance is the sum of vectors 840 and 850. The at least one monitor assembly 1115 is shown as flat in this representation, but it can be curved or flat.
Figure 22 is a ray diagram illustrating light from a head-mounted display apparatus entering an eye 930. The light is shown emanating from a monitor assembly 1115 that has a curved arrangement.
In particular, the light is shown emanating from three portions of the outer surface 1120 of the monitor assembly 1115. All of the pencils of light from the three portions, such as 1220, are collimated and able to be seen and focused by the eye 930, as seen at points 1010 on retina 1015.

VI. General Considerations

In terms of the overall structure of the HMD, Table 1 sets forth representative, non-limiting examples of the parameters that an HMD display constructed in accordance with the present disclosure will typically meet.
In addition, the HMD monitors disclosed in this document will typically have an interpixel distance that is small enough to ensure that a convincing image is established on the user's visual plane.
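A back-of-the-envelope check of the interpixel requirement can be made by converting the field of view and pixel count into an angular pixel pitch and comparing it with the eye's nominal one-arc-minute acuity. The numbers below are illustrative values taken from the ranges in Table 1, not a specification:

```python
def angular_pixel_pitch_arcmin(fov_deg, pixels_across):
    # Average angular spacing between adjacent pixels across the field.
    return fov_deg * 60.0 / pixels_across

# ~168 degrees of horizontal FOV spread over 1920 pixels:
print(angular_pixel_pitch_arcmin(168.0, 1920))  # ~5.25 arc minutes per pixel
```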
Various features that can be included in the head-mounted displays disclosed in this document include, without limitation, the following, some of which were referenced above:

(1) In some embodiments, the reflective optical surface (when used) may be semitransparent, letting light in from the outside environment.
The images generated on the internal monitor can then overlay the external image.
The two images can be aligned through the use of location equipment, such as gyroscopes, cameras, and software manipulation of the computer generated image so that the virtual images are in the appropriate locations in the external environment.
In particular, a camera, an accelerometer, and/or gyroscopes can be used to help the apparatus register where it is in physical reality and to superimpose its images on the external view.
In these embodiments, the balance between the relative transmittance and reflectance of the reflective optical surface can be selected to provide the user with superimposed images having appropriate brightness characteristics.
Using the correct balance of light admitted from the environment outside the HMD and light generated internally allows a reflection to be seen on an internal surface of the HMD that appears to be located in the environment outside the glasses.
Also, in these embodiments, the real-world image and the computer-generated image may both appear to be at approximately the same apparent distance, so that the eye can focus on both images at once.

(2) In some embodiments, the reflective optical surface (when used) is kept as thin as possible in order to minimize the effect on the position or focus of external light passing through the surface.

(3) In some embodiments, the head-mounted display apparatus provides a field of view for each eye of at least 100 degrees, at least 150 degrees, or at least 200 degrees.

(4) In some embodiments, the field of view provided by the head-mounted display apparatus to each eye does not overlap the user's nose to any great degree.

(5) In some embodiments, the reflective optical surface (when used) may employ a progressive transition of its optical prescription across the field of view to maintain focus on the available display area.

(6) In some embodiments, ray tracing can be used to customize the parameters of the apparatus for a particular implementation, such as military training, flight simulation, gaming, and other commercial applications.

(7) In some embodiments, the reflective optical surface (when used) and/or the display surface, as well as the pixel lens properties and locations, and the distances between the display and the reflective optical surface (when used) and between the reflective optical surface (when used) and the eye, can be manipulated with respect to a Modulation Transfer Function (MTF) specification at the retina and/or fovea.

(8) In some embodiments, the HMDs disclosed in this document can be implemented in applications such as, but not limited to, sniper detection, commercial training, military training and operations, and CAD manufacturing.

(9) Although shown as flat in the figures, the image display system can also have a curved light-emitting surface.
Once designed, the reflective optical surfaces disclosed in this document (for example, FS / UWA / RO surfaces) can be produced, for example, manufactured in quantity, using a variety of techniques and a variety of materials now known or subsequently developed.
For example, surfaces can be made of plastic materials that have been metallized to be appropriately reflective.
Polished plastic or glass-like materials can also be used, excluding anti-reflective coatings on their reflective surface.
For "augmented reality" applications, reflective optical surfaces can be constructed from a transmissive material with small built-in reflectors, thus reflecting a portion of an incident wavefront while allowing light to be transmitted through the material.
For prototype parts, an acrylic plastic (for example, Plexiglas) can be used, with the part being formed by diamond turning.
For production parts, both acrylic and polycarbonate can, for example, be used with the part being formed, for example, by injection molding techniques.
The reflective optical surface can be described by a detailed Computer-Aided Design (CAD) description or as a Non-Uniform Rational B-Spline (NURBS) surface, which can be converted into a CAD description.
Having a CAD file can allow the device to be made using 3-D printing, where the CAD description results in a 3-D object directly, without requiring machining.

The mathematical techniques discussed above can be coded in various programming environments and/or programming languages, now known or subsequently developed.
A currently preferred programming environment is the Java language running on the Eclipse programmer's interface.
Other programming environments such as Microsoft Visual C# can also be used if desired.
Calculations can also be performed using the Mathcad platform marketed by PTC of Needham, Massachusetts, and/or the Matlab platform from MathWorks, Inc., of Natick, Massachusetts.
The resulting programs can be stored on a hard drive, memory card, CD, or similar device.
The procedures can be performed using typical desktop computing equipment available from a variety of vendors, for example, DELL, HP, TOSHIBA, etc.
Alternatively, more powerful computing equipment can be used including "cloud" computing if desired.
From the above, it can be seen that, in various embodiments, high resolution and a wide (wide-angle) field of view can be provided in a sunglasses-like HMD device.
The wide field of view can, for example, be made any desired angle through the use of larger and/or more displays.
The displayed image can be superimposed on the visualized physical reality of a surrounding environment.
The use of pixel lenses allows the user's eye to be in close proximity to the HMD screen while focusing on the distant scene, and the image from the screen also appears to be distant.
The HMD establishes and maintains a fixed relationship between the image display system, the pixel lens, and the user's eyes.
Pixel intensity can be individually controlled based on the distance from the image display system to the user's eyes or, for embodiments that employ a beam splitter, based on the distance from the image display system to the beam splitter, the curvature of the beam splitter, and the distance from the beam splitter to the user's eye.
A variety of modifications that do not depart from the scope and spirit of the invention will be apparent to persons skilled in the art from the foregoing disclosure.
The following claims are intended to cover the specific modalities described in this document as well as modifications, variations and equivalents of these modalities.
TABLE 1

Name | Description | Units | Minimum | Maximum
Distance of the reflective surface from the eye | | mm | 10 | 400
Distance of the reflective surface from the monitor | | mm | 10 | 400
Monitor size | Horizontal | mm | 9 | 100
Monitor size | Vertical | mm | 9 | 100
Monitor resolution | Horizontal | pixels | 640 | 1920+
Monitor resolution | Vertical | pixels | 480 | 1080+
Weight of the HMD | | grams | 1 | 1000
Size of the HMD | Distance in front of the face | mm | 10 | 140
Human pupil size | | mm | 3 to 4 | 5 to 9
Size of the reflective surface | For example, smaller than head width / 2 | mm | 30 | 78
Number of reflective optical surfaces | | units | 1 | 3+
Maximum illumination of the eye | For example, bright enough to allow viewing on a sunny day | fc (foot-candles) | 5,000 | 10,000
Battery life | | hours | 3 | 4
Optical resolution | Diameter of angular blur spot or RMS blur | arc minutes | 1 | 10
Line pairs of resolution at the fovea | | line pairs / mm | 20 | 140
Maximum variation in the apparent brightness of the image | | percent | | 20
Maximum image distortion | | percent | | 5
Maximum derivative of brightness | | percent / degree | | 5
CLAIMS
1. A head-mounted display apparatus comprising: (I) a frame adapted to be mounted on a user's head; (II) an image display system having a light-emitting surface comprising a series of light-emitting pixels, said image display system being supported by the frame; and (III) a reflective optical surface supported by the frame; wherein: (a) the apparatus comprises a series of pixel lenses located between the series of light-emitting pixels and the reflective optical surface, one pixel lens for each of the light-emitting pixels, said pixel lens being aligned with and receiving light from its associated light-emitting pixel during use of the apparatus; and (b) the series of pixel lenses, either alone or in combination with the reflective optical surface, collimates or substantially collimates the light emitted from the series of light-emitting pixels during use of the apparatus.
2. Apparatus according to claim 1, wherein the reflective optical surface is flat and the series of pixel lenses alone collimates or substantially collimates the light emitted from the light-emitting pixels during use of the apparatus.
3. Apparatus according to claim 1, wherein the reflective optical surface is curved and the series of pixel lenses and the reflective optical surface, in combination, collimate or substantially collimate the light emitted from the series of light-emitting pixels during use of the apparatus.
4. Apparatus according to claim 1, wherein the reflective optical surface is a continuous surface that is not rotationally symmetric about any coordinate axis of a three-dimensional Cartesian coordinate system.
5. Apparatus according to claim 1, wherein the image display system and the series of pixel lenses form a display assembly that is convexly curved towards the reflective optical surface.
6. Apparatus according to claim 1, wherein, during use of the apparatus: (i) the reflective optical surface and the series of pixel lenses produce spatially separated virtual images of spatially separated portions of the light-emitting surface, at least one of the spatially separated virtual images being angularly separated from at least one other of the spatially separated virtual images by at least 100 degrees, said angular separation being measured from the center of rotation of a nominal user's eye; and (ii) at least one point on the reflective optical surface is angularly separated from at least one other point on the reflective optical surface by at least 100 degrees, said angular separation being measured from the center of rotation of a nominal user's eye.
7. Head-mounted display apparatus according to claim 6, wherein: at least one of the spatially separated virtual images is angularly separated from at least one other of the spatially separated virtual images by at least 150 degrees; and at least one point on the reflective optical surface is angularly separated from at least one other point on the reflective optical surface by at least 150 degrees.
8. Head-mounted display apparatus according to claim 6, wherein: at least one of the spatially separated virtual images is angularly separated from at least one other of the spatially separated virtual images by at least 200 degrees; and at least one point on the reflective optical surface is angularly separated from at least one other point on the reflective optical surface by at least 200 degrees.
9. Head-mounted display apparatus according to claim 1, wherein the reflective optical surface is semitransparent.
10. A head-mounted display apparatus comprising: (I) a frame adapted to be mounted on a user's head; (II) an image display system having a light-emitting surface comprising a series of light-emitting pixels, said image display system being supported by the frame; and (III) a free-space, ultra-wide-angle reflective optical surface supported by the frame; wherein: (a) the apparatus comprises a series of pixel lenses located between the series of light-emitting pixels and the free-space, ultra-wide-angle reflective optical surface, one pixel lens for each of the light-emitting pixels, the pixel lens being aligned with and receiving light from its associated light-emitting pixel during use of the apparatus; and (b) during use of the apparatus, the free-space, ultra-wide-angle reflective optical surface and the series of pixel lenses produce spatially separated virtual images of spatially separated portions of the light-emitting surface, at least one of the spatially separated virtual images being angularly separated from at least one other of the spatially separated virtual images by at least 100 degrees, the angular separation being measured from the center of rotation of a nominal user's eye.
11. Apparatus according to claim 10, wherein the series of pixel lenses, in combination with the free-space, ultra-wide-angle reflective optical surface, collimates or substantially collimates the light emitted from the series of light-emitting pixels during use of the apparatus.
12. Apparatus according to claim 10, wherein the series of pixel lenses alone collimates or substantially collimates the light emitted from the series of light-emitting pixels during use of the apparatus.
13. Head-mounted display apparatus according to claim 10, wherein at least one of the spatially separated virtual images is angularly separated from at least one other of the spatially separated virtual images by at least 150 degrees.
14. Head-mounted display apparatus according to claim 10, wherein at least one of the spatially separated virtual images is angularly separated from at least one other of the spatially separated virtual images by at least 200 degrees.
15. Head-mounted display apparatus according to claim 10, wherein the free-space, ultra-wide-angle reflective optical surface is semitransparent.
16. A head-mounted display apparatus comprising: (I) a frame adapted to be mounted on a user's head; and (II) a monitor assembly supported by the frame, said monitor assembly comprising: (a) an image display system having a light-emitting surface comprising a series of light-emitting pixels; and (b) a series of pixel lenses, one pixel lens for each of the light-emitting pixels, said pixel lens being aligned with and receiving light from its associated light-emitting pixel during use of the apparatus; wherein, during use of the apparatus, the series of pixel lenses is the only component of the apparatus with optical power between the light-emitting surface and a user's eye.
17. Apparatus according to claim 16, wherein the monitor assembly has a concave surface that faces a user's eye during use of the apparatus.
18. A head-mounted display apparatus comprising: (I) a frame adapted to be mounted on a user's head; and (II) an image display system supported by the frame; wherein: (a) the image display system comprises a light-emitting surface comprising a series of light-emitting pixels; and (b) the apparatus comprises a series of spherical pixel lenses, one spherical pixel lens for each of the light-emitting pixels, said series of spherical pixel lenses being located between the series of light-emitting pixels and a user's eye during use of the apparatus.
19. A method comprising the steps of: generating an image with an image display system having a light-emitting surface that comprises a series of light-emitting pixels; independently collimating or substantially collimating the light from each respective light-emitting pixel in the series of light-emitting pixels with a respective pixel lens of a series of pixel lenses aligned with the series of light-emitting pixels; providing the collimated or substantially collimated light from the series of pixel lenses to a reflector positioned with respect to a user's eye; and reflecting the collimated or substantially collimated light from the reflector to the user's eye.
[20]
20. The method of claim 19, wherein the reflector comprises a beam splitter and the method further comprises passing external light through the reflector to provide the user's eye with a view of an environment external to the reflector.
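The reflection step of claims 19 and 20 is ordinary mirror reflection at the reflector surface; with a beam splitter, part of the light is reflected toward the eye while external light passes through along the same geometry. A minimal vector sketch, assuming a hypothetical 45-degree splitter orientation:

    import numpy as np

    def reflect(direction, normal):
        # Mirror reflection of a ray direction about a surface normal:
        # r = d - 2 (d . n) n, with n normalized to unit length.
        d = np.asarray(direction, float)
        n = np.asarray(normal, float)
        n = n / np.linalg.norm(n)
        return d - 2.0 * np.dot(d, n) * n

    # Collimated ray leaving the pixel-lens series along +z, folded by a
    # beam splitter tilted 45 degrees toward the eye:
    print(reflect([0.0, 0.0, 1.0], [0.0, 1.0, -1.0]))  # -> [0. 1. 0.]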
[21]
21. Method comprising the steps of: (a) producing light from a series of light-emitting pixels; (b) receiving the light produced by the series of light-emitting pixels at a series of pixel lenses positioned so that the light from each light-emitting pixel is collimated or substantially collimated by a corresponding pixel lens in the series of pixel lenses; and (c) providing the collimated or substantially collimated light directly to a user's eye.
Similar technologies:
Publication No. | Publication date | Patent title
BR112013014975A2|2020-08-11|collimation display with pixel lenses
US10495790B2|2019-12-03|Head-mounted display apparatus employing one or more Fresnel lenses
US8625200B2|2014-01-07|Head-mounted display apparatus employing one or more reflective optical surfaces
CA2815461C|2019-04-30|Head-mounted display apparatus employing one or more fresnel lenses
KR101928764B1|2018-12-13|Head-mounted display apparatus employing one or more reflective optical surfaces
TWI553344B|2016-10-11|Head-mounted display apparatus employing one or more fresnel lenses
TWI559034B|2016-11-21|Head-mounted display apparatus employing one or more reflective optical surfaces
US9454007B1|2016-09-27|Free-space lens design and lenses therefrom
AU2015249168B2|2016-11-17|Collimating display with pixel lenses
BR112013009855B1|2021-11-09|HEAD MOUNTED DISPLAY DEVICE
Patent family:
Publication No. | Publication date
US9720228B2|2017-08-01|
CA2821401C|2019-04-30|
EP2652542B1|2019-12-11|
CN103348278A|2013-10-09|
MX2013006722A|2014-01-31|
US20120154920A1|2012-06-21|
KR20140018209A|2014-02-12|
JP2014505271A|2014-02-27|
KR101883221B1|2018-08-30|
CA2821401A1|2012-06-21|
WO2012083042A1|2012-06-21|
EP2652542A1|2013-10-23|
AU2011343660A1|2013-07-04|
JP6246592B2|2017-12-13|
Cited references:
Publication No. | Filing date | Publication date | Applicant | Patent title

US5982343A|1903-11-29|1999-11-09|Olympus Optical Co., Ltd.|Visual display apparatus|
US3880509A|1974-03-15|1975-04-29|Us Navy|Wide-angle on-axis projection system|
US4026641A|1975-12-30|1977-05-31|The United States Of America As Represented By The Secretary Of The Army|Toric reflector display|
US4176468A|1978-06-22|1979-12-04|Marty William B Jr|Cockpit display simulator for electronic countermeasure training|
JPS6212483B2|1979-06-09|1987-03-19|Kenichi Matsuda|
US4406532A|1980-03-25|1983-09-27|Howlett Eric M|Wide angle color photography method and system|
JP2789195B2|1988-07-21|1998-08-20|大日本印刷株式会社|Manufacturing method of plastic sheet|
FR2662894B1|1990-06-01|1995-11-17|Thomson Csf|DEVICE FOR VIEWING SIMULATED IMAGES FOR A HELMET.|
AT180578T|1992-03-13|1999-06-15|Kopin Corp|DISPLAY DEVICE ON THE HEAD|
JPH06509185A|1991-07-03|1994-10-13|
DE69221987T2|1991-11-01|1998-02-05|Sega Enterprises Kk|Imaging device attached to the head|
EP0566001B1|1992-04-07|1999-07-14|Raytheon Company|Wide spectral bandwidth virtual image display optical system|
US5325386A|1992-04-21|1994-06-28|Bandgap Technology Corporation|Vertical-cavity surface emitting laser assay display system|
JP3155335B2|1992-04-24|2001-04-09|オリンパス光学工業株式会社|Visual display device|
US5572343A|1992-05-26|1996-11-05|Olympus Optical Co., Ltd.|Visual display having see-through function and stacked liquid crystal shutters of opposite viewing angle directions|
US5561538A|1992-11-17|1996-10-01|Sharp Kabushiki Kaisha|Direct-view display apparatus|
US5309169A|1993-02-01|1994-05-03|Honeywell Inc.|Visor display with fiber optic faceplate correction|
US5757544A|1993-03-09|1998-05-26|Olympus Optical Co., Ltd.|Image display apparatus|
US5388990A|1993-04-23|1995-02-14|The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration|Virtual reality flight control display with six-degree-of-freedom controller and spherical orientation overlay|
US5347400A|1993-05-06|1994-09-13|Ken Hunter|Optical system for virtual reality helmet|
US7310072B2|1993-10-22|2007-12-18|Kopin Corporation|Portable communication display device|
JP3212204B2|1993-11-11|2001-09-25|オリンパス光学工業株式会社|Visual display device|
US6160666A|1994-02-07|2000-12-12|I-O Display Systems Llc|Personal visual display system|
JP3207036B2|1994-02-15|2001-09-10|株式会社小糸製作所|Optical design method of lamp using light emitting element|
JP3430615B2|1994-03-04|2003-07-28|三菱電機株式会社|Eyepiece image display device|
US5714967A|1994-05-16|1998-02-03|Olympus Optical Co., Ltd.|Head-mounted or face-mounted image display apparatus with an increased exit pupil|
US5803738A|1994-06-24|1998-09-08|Cgsd Corporation|Apparatus for robotic force simulation|
US5483307A|1994-09-29|1996-01-09|Texas Instruments, Inc.|Wide field of view head-mounted display|
US5581271A|1994-12-05|1996-12-03|Hughes Aircraft Company|Head mounted visual display|
TW275590B|1994-12-09|1996-05-11|Sega Enterprises Kk|Head mounted display and system for use therefor|
US5798738A|1995-03-25|1998-08-25|Ricoh Company, Ltd.|Printing manager system for a copier in a network|
JPH08278476A|1995-04-07|1996-10-22|Omron Corp|Liquid crystal display panel and head mount display|
JP3599828B2|1995-05-18|2004-12-08|オリンパス株式会社|Optical device|
JPH09105885A|1995-10-12|1997-04-22|Canon Inc|Head mount type stereoscopic image display device|
JPH09113842A|1995-10-17|1997-05-02|Olympus Optical Co Ltd|Head or face mount type video display device|
WO1997022964A1|1995-12-18|1997-06-26|Bell Communications Research, Inc.|Flat virtual displays for virtual reality|
US5641288A|1996-01-11|1997-06-24|Zaenglein, Jr.; William G.|Shooting simulating process and training device using a virtual reality display screen|
JPH09219832A|1996-02-13|1997-08-19|Olympus Optical Co Ltd|Image display|
JP3761957B2|1996-02-15|2006-03-29|キヤノン株式会社|Reflective optical system and imaging apparatus using the same|
WO1997035223A1|1996-03-15|1997-09-25|Retinal Display Cayman Limited|Method of and apparatus for viewing an image|
US5701132A|1996-03-29|1997-12-23|University Of Washington|Virtual retinal display with expanded exit pupil|
US6592222B2|1996-07-31|2003-07-15|Massengill Family Trust|Flicker and frequency doubling in virtual reality|
US5834676A|1996-08-12|1998-11-10|Sight Unseen|Weapon-mounted location-monitoring apparatus|
US6215593B1|1996-11-13|2001-04-10|Ian A. Bruce|Portable wide-field optical system with microlenses and fiber-optic image transfer element|
US5715094A|1996-12-03|1998-02-03|Hughes Electronics|Lensless helmet/head mounted display|
JP3943680B2|1997-01-06|2007-07-11|オリンパス株式会社|Video display device|
JPH10206786A|1997-01-17|1998-08-07|Sanyo Electric Co Ltd|Head-mounted image display device|
JPH1080575A|1997-05-02|1998-03-31|Sega Enterp Ltd|Game device|
JP3716625B2|1997-09-18|2005-11-16|コニカミノルタホールディングス株式会社|Video observation apparatus, camera, and video observation system|
US6829087B2|1998-04-15|2004-12-07|Bright View Technologies, Inc.|Micro-lens array based light transmitting screen with tunable gain|
JP2000047138A|1998-07-27|2000-02-18|Mr System Kenkyusho:Kk|Image display device|
US6140979A|1998-08-05|2000-10-31|Microvision, Inc.|Scanned display with pinch, timing, and distortion correction|
FR2784201B1|1998-10-06|2003-01-31|Sextant Avionique|OPTICAL DEVICE FOR A HELMET SIGHT COMPRISING A DIFFRACTIVE MIRROR|
JP2000199853A|1998-10-26|2000-07-18|Olympus Optical Co Ltd|Image-formation optical system and observation optical system|
US7324081B2|1999-03-02|2008-01-29|Siemens Aktiengesellschaft|Augmented-reality system for situation-related support of the interaction between a user and an engineering apparatus|
US7110013B2|2000-03-15|2006-09-19|Information Decision Technology|Augmented reality display integrated with self-contained breathing apparatus|
US7158096B1|1999-06-21|2007-01-02|The Microoptical Corporation|Compact, head-mountable display device with suspended eyepiece assembly|
US6445362B1|1999-08-05|2002-09-03|Microvision, Inc.|Scanned display with variation compensation|
JP4573393B2|2000-01-06|2010-11-04|オリンパス株式会社|Image display device|
US20010033401A1|2000-03-17|2001-10-25|Minolta Co., Ltd.|Information display device|
US6813085B2|2000-06-26|2004-11-02|Angus Duncan Richards|Virtual reality display device|
US20020094189A1|2000-07-26|2002-07-18|Nassir Navab|Method and system for E-commerce video editing|
US6611253B1|2000-09-19|2003-08-26|Harel Cohen|Virtual input environment|
KR20020025301A|2000-09-28|2002-04-04|오길록|Apparatus and Method for Furnishing Augmented-Reality Graphic using Panoramic Image with Supporting Multiuser|
JP3406965B2|2000-11-24|2003-05-19|キヤノン株式会社|Mixed reality presentation device and control method thereof|
US6919866B2|2001-02-06|2005-07-19|International Business Machines Corporation|Vehicular navigation system|
JP2002258208A|2001-03-01|2002-09-11|Mixed Reality Systems Laboratory Inc|Optical element and composite display device utilizing it|
JP2002287077A|2001-03-23|2002-10-03|Nikon Corp|Video display device|
US6919867B2|2001-03-29|2005-07-19|Siemens Corporate Research, Inc.|Method and apparatus for augmented reality visualization|
US6529331B2|2001-04-20|2003-03-04|Johns Hopkins University|Head mounted display with full field of view and high resolution|
US6771423B2|2001-05-07|2004-08-03|Richard Geist|Head-mounted virtual display apparatus with a near-eye light deflecting element in the peripheral field of view|
US6731434B1|2001-05-23|2004-05-04|University Of Central Florida|Compact lens assembly for the teleportal augmented reality system|
US7009773B2|2001-05-23|2006-03-07|Research Foundation Of The University Of Central Florida, Inc.|Compact microlenslet arrays imager|
DE10127367A1|2001-06-06|2002-12-12|Klaus Dietrich|System to project detailed high resolution images onto the surface of pilot's helmet and are superimposed on general field|
US20020186179A1|2001-06-07|2002-12-12|Knowles Gary R.|Optical display device|
US6522474B2|2001-06-11|2003-02-18|Eastman Kodak Company|Head-mounted optical apparatus for stereoscopic display|
DE10131720B4|2001-06-30|2017-02-23|Robert Bosch Gmbh|Head-Up Display System and Procedures|
JP4751534B2|2001-07-24|2011-08-17|大日本印刷株式会社|Optical system and apparatus using the same|
US20070132785A1|2005-03-29|2007-06-14|Ebersole John F Jr|Platform for immersive gaming|
US7072096B2|2001-12-14|2006-07-04|Digital Optics International, Corporation|Uniform illumination system|
GB2387920B|2002-04-24|2005-11-23|Seos Ltd|An eyepiece for viewing a flat image and comprising a cemented doublet of reflecting and refracting optical components|
US20040008157A1|2002-06-26|2004-01-15|Brubaker Curtis M.|Cap-mounted monocular video/audio display|
JP3755036B2|2002-09-02|2006-03-15|国立大学法人大阪大学|Wide viewing angle head mounted display device|
EP1544666A4|2002-09-24|2010-11-24|Kenji Nishi|Image display unit and projection optical system|
US7002551B2|2002-09-25|2006-02-21|Hrl Laboratories, Llc|Optical see-through augmented reality modified-scale display|
KR100484174B1|2002-11-06|2005-04-18|삼성전자주식회사|Head mounted display|
US20040130783A1|2002-12-02|2004-07-08|Solomon Dennis J|Visual display with full accommodation|
KR100477684B1|2002-12-05|2005-03-21|삼성전자주식회사|Head mounted display|
US7432879B2|2003-02-10|2008-10-07|Schonlau William J|Personal viewer|
US7119965B1|2003-02-24|2006-10-10|University Of Central Florida Research Foundation, Inc.|Head mounted projection display with a wide field of view|
US7063256B2|2003-03-04|2006-06-20|United Parcel Service Of America|Item tracking and processing systems and methods|
DE10316533A1|2003-04-10|2004-11-04|Carl Zeiss|Head-mounted display, has two sub-lens units including diffractive lens units for compensating for their dispersion errors|
JP4532856B2|2003-07-08|2010-08-25|キヤノン株式会社|Position and orientation measurement method and apparatus|
CA2449982A1|2003-07-16|2005-01-16|Aurora Digital Advertising Inc.|Three dimensional display method, system and apparatus|
ITTO20030640A1|2003-08-19|2005-02-20|Luigi Giubbolini|MAN INTERFACE SYSTEM - MACHINE USING|
ITTO20030662A1|2003-08-29|2005-02-28|Fiat Ricerche|VIRTUAL VISUALIZATION ARRANGEMENT FOR A FRAMEWORK|
IL157837A|2003-09-10|2012-12-31|Yaakov Amitai|Substrate-guided optical device particularly for three-dimensional displays|
IL157838A|2003-09-10|2013-05-30|Yaakov Amitai|High brightness optical device|
US8884845B2|2003-10-28|2014-11-11|Semiconductor Energy Laboratory Co., Ltd.|Display device and telecommunication system|
JP4364002B2|2004-02-06|2009-11-11|オリンパス株式会社|Head-mounted camera and photographing method using head-mounted camera|
TWI244318B|2004-05-04|2005-11-21|Universal Vision Biotechnology|Focus adjustable head mounted display system and method and device for realizing the system|
WO2006093511A2|2004-06-10|2006-09-08|Bae Systems Information And Electronic Systems Integration Inc.|Method and apparatus for detecting sources of projectiles|
JP4370997B2|2004-07-29|2009-11-25|株式会社島津製作所|Head-mounted display device|
AU2005269256B2|2004-08-03|2008-08-07|Silverbrook Research Pty Ltd|Head mounted display with wave front modulator|
US20070020587A1|2004-08-05|2007-01-25|Seymore Michael Z|Interactive motion simulator|
WO2006023647A1|2004-08-18|2006-03-02|Sarnoff Corporation|Systeme and method for monitoring training environment|
US7545571B2|2004-09-08|2009-06-09|Concurrent Technologies Corporation|Wearable display system|
JP2006091477A|2004-09-24|2006-04-06|Konica Minolta Holdings Inc|Wide-angle observation optical system equipped with holographic reflection surface|
US7619825B1|2004-09-27|2009-11-17|Rockwell Collins, Inc.|Compact head up display with wide viewing angle|
TWI263831B|2004-09-30|2006-10-11|Himax Tech Inc|Head mounted display|
JP4404738B2|2004-10-05|2010-01-27|矢崎総業株式会社|Head-up display device|
US20060103590A1|2004-10-21|2006-05-18|Avner Divon|Augmented display system and methods|
US8885139B2|2005-01-21|2014-11-11|Johnson & Johnson Vision Care|Adaptive electro-active lens with variable focal length|
NZ537849A|2005-01-21|2007-09-28|Peter James Hilton|Direct Retinal Display projecting a scanned optical beam via diverging and converging reflectors|
US7812815B2|2005-01-25|2010-10-12|The Broad of Trustees of the University of Illinois|Compact haptic and augmented virtual reality system|
EP1846796A1|2005-02-10|2007-10-24|Lumus Ltd|Substrate-guided optical device particularly for vision enhanced optical systems|
US8140197B2|2005-02-17|2012-03-20|Lumus Ltd.|Personal navigation system|
JP4752309B2|2005-04-07|2011-08-17|ソニー株式会社|Image display apparatus and method|
US8040361B2|2005-04-11|2011-10-18|Systems Technology, Inc.|Systems and methods for combining virtual and real-time physical environments|
US7766515B2|2005-04-20|2010-08-03|Dragonfish Technologies, Llc|Light source with non-imaging optical distribution apparatus|
US7151639B2|2005-05-11|2006-12-19|Everspring Industry, Co., Ltd.|Thin-type spherical lens|
US20060281061A1|2005-06-13|2006-12-14|Tgds, Inc.|Sports Training Simulation System and Associated Methods|
JP2007086500A|2005-09-22|2007-04-05|Sony Corp|Display device|
US7486341B2|2005-11-03|2009-02-03|University Of Central Florida Research Foundation, Inc.|Head mounted display with eye accommodation having 3-D image producing system consisting of, for each eye, one single planar display screen, one single planar tunable focus LC micro-lens array, one single planar black mask and bias lens|
EP1946179B1|2005-11-10|2012-12-05|BAE Systems PLC|Method of modifying a display apparatus|
US8280405B2|2005-12-29|2012-10-02|Aechelon Technology, Inc.|Location based wireless collaborative environment with a visual user interface|
US20070177275A1|2006-01-04|2007-08-02|Optical Research Associates|Personal Display Using an Off-Axis Illuminator|
US7732694B2|2006-02-03|2010-06-08|Outland Research, Llc|Portable music player with synchronized transmissive visual overlays|
US7499217B2|2006-03-03|2009-03-03|University Of Central Florida Research Foundation, Inc.|Imaging systems for eyeglass-based display devices|
IL174170A|2006-03-08|2015-02-26|Abraham Aharoni|Device and method for binocular alignment|
CN100462984C|2006-03-17|2009-02-18|清华大学|Freeform curved surface reflector design system and method thereof|
US20070243916A1|2006-04-14|2007-10-18|Lee Ren E|Objective oriented reality horror survival game|
US8201436B2|2006-04-28|2012-06-19|Nokia Corporation|Calibration|
US7548697B2|2006-05-12|2009-06-16|Edison Hudson|Method and device for controlling a remote vehicle|
SE0601216L|2006-05-31|2007-12-01|Abb Technology Ltd|Virtual workplace|
US7473020B2|2006-07-07|2009-01-06|William Pickering|Light emitting diode display system|
US7965868B2|2006-07-20|2011-06-21|Lawrence Livermore National Security, Llc|System and method for bullet tracking and shooter localization|
KR100809479B1|2006-07-27|2008-03-03|한국전자통신연구원|Face mounted display apparatus and method for mixed reality environment|
JP4835327B2|2006-08-30|2011-12-14|コニカミノルタホールディングス株式会社|Image display device and head-mounted image display device|
US7735998B2|2006-10-25|2010-06-15|Volk Donald A|Multi-layered multifocal lens with blended refractive index|
MX2009004327A|2006-10-25|2009-11-13|Donald A Volk|Multi-layered multifocal lens with blended refractive index.|
US7547101B2|2007-01-02|2009-06-16|Hind-Sight Industries, Inc.|Eyeglasses with integrated telescoping video display|
US20090040308A1|2007-01-15|2009-02-12|Igor Temovskiy|Image orientation correction method and system|
US8259239B2|2007-01-18|2012-09-04|The Arizona Board Of Regents On Behalf Of The University Of Arizona|Polarized head-mounted projection display|
US20080198459A1|2007-01-29|2008-08-21|Fergason Patent Properties, Llc|Conjugate optics projection display system and method having improved resolution|
US20100279255A1|2007-02-16|2010-11-04|Ohio University|Vehicle simulator system|
US7762683B2|2007-02-23|2010-07-27|Raytheon Company|Optical device with tilt and power microlenses|
DE102007009828A1|2007-02-28|2008-09-04|Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V.|Image recording device for use in vacuum gripper, for use in robot with robot arm, has superimposing device, visibility aiding device for generating image signals, where video spectacle has transparent visual elements|
IL183637A|2007-06-04|2013-06-27|Zvi Lapidot|Distributed head-mounted display|
US20080309586A1|2007-06-13|2008-12-18|Anthony Vitale|Viewing System for Augmented Reality Head Mounted Display|
US8051597B1|2007-06-14|2011-11-08|Cubic Corporation|Scout sniper observation scope|
US20090002574A1|2007-06-29|2009-01-01|Samsung Electronics Co., Ltd.|Method and a system for optical design and an imaging device using an optical element with optical aberrations|
JP5145832B2|2007-09-12|2013-02-20|株式会社島津製作所|Head-mounted display device and head-mounted display device system|
JP5216761B2|2007-09-26|2013-06-19|パナソニック株式会社|Beam scanning display device|
US20090228251A1|2007-11-08|2009-09-10|University Of Central Florida Research Foundation, Inc.|Systems and Methods for Designing Optical Surfaces|
JP2009128565A|2007-11-22|2009-06-11|Toshiba Corp|Display device, display method and head-up display|
US7696919B2|2008-01-03|2010-04-13|Lockheed Martin Corporation|Bullet approach warning system and method|
US8025230B2|2008-01-04|2011-09-27|Lockheed Martin Corporation|System and method for prioritizing visually aimed threats for laser-based countermeasure engagement|
WO2009094643A2|2008-01-26|2009-07-30|Deering Michael F|Systems using eye mounted displays|
US7928927B1|2008-03-17|2011-04-19|Rockwell Collins, Inc.|Head worn head up display system|
JP2009232133A|2008-03-24|2009-10-08|Nikon Corp|Portable terminal|
US20100149073A1|2008-11-02|2010-06-17|David Chaum|Near to Eye Display System and Appliance|
JP2010020065A|2008-07-10|2010-01-28|Olympus Corp|Image display apparatus|
GB2461907A|2008-07-17|2010-01-20|Sharp Kk|Angularly restricted display|
US8437223B2|2008-07-28|2013-05-07|Raytheon Bbn Technologies Corp.|System and methods for detecting shooter locations from an aircraft|
US7663793B1|2008-07-31|2010-02-16|Institut National D'optique|Wide angle immersive display system|
JP2012501020A|2008-08-25|2012-01-12|ウニヴェルジテートチューリッヒプロレクトラートエムエヌヴェー|Adjustable virtual reality system|
JPWO2010047212A1|2008-10-20|2012-03-22|コニカミノルタオプト株式会社|Video display device|
US9480919B2|2008-10-24|2016-11-01|Excalibur Ip, Llc|Reconfiguring reality using a reality overlay device|
US9600067B2|2008-10-27|2017-03-21|Sri International|System and method for generating a mixed reality environment|
WO2010075634A1|2008-12-30|2010-07-08|Karen Collins|Method and system for visual representation of sound|
US20100238161A1|2009-03-19|2010-09-23|Kenneth Varga|Computer-aided system for 360º heads up display of safety/mission critical data|
US8059342B2|2009-04-03|2011-11-15|Vuzix Corporation|Beam segmentor for enlarging viewing aperture of microdisplay|
WO2010123934A1|2009-04-20|2010-10-28|The Arizona Board Of Regents On Behalf Of The University Of Arizona|Optical see-through free-form head-mounted display|
WO2010127285A2|2009-04-30|2010-11-04|Tetracam, Inc.|Method and apparatus for providing a 3d image via media device|
JP5402293B2|2009-06-22|2014-01-29|ソニー株式会社|Head-mounted display and image display method in head-mounted display|
GB2474007A|2009-08-27|2011-04-06|Simon R Daniel|Communication in and monitoring of a disaster area, optionally including a disaster medical pack|
JP2011059444A|2009-09-10|2011-03-24|Olympus Corp|Spectacles-type image display device|
US8320217B1|2009-10-01|2012-11-27|Raytheon Bbn Technologies Corp.|Systems and methods for disambiguating shooter locations with shockwave-only location|
JP2011133633A|2009-12-24|2011-07-07|Olympus Corp|Visual display device|
US20110214082A1|2010-02-28|2011-09-01|Osterhout Group, Inc.|Projection triggering through an external marker in an augmented reality eyepiece|
US20110213664A1|2010-02-28|2011-09-01|Osterhout Group, Inc.|Local advertising content on an interactive head-mounted eyepiece|
US8488246B2|2010-02-28|2013-07-16|Osterhout Group, Inc.|See-through near-eye display glasses including a curved polarizing film in the image source, a partially reflective, partially transmitting optical element and an optically flat film|
US8964298B2|2010-02-28|2015-02-24|Microsoft Corporation|Video display modification based on sensor input for a see-through near-to-eye display|
AU2011220382A1|2010-02-28|2012-10-18|Microsoft Corporation|Local advertising content on an interactive head-mounted eyepiece|
GB2478738A|2010-03-16|2011-09-21|Belron Hungary Kft Zug Branch|Eye level display in communication with electronic device|
US20110250962A1|2010-04-09|2011-10-13|Feiner Steven K|System and method for a 3d computer game with true vectorof gravity|
US20120050144A1|2010-08-26|2012-03-01|Clayton Richard Morlock|Wearable augmented reality computing apparatus|
US8941559B2|2010-09-21|2015-01-27|Microsoft Corporation|Opacity filter for display device|
US8781794B2|2010-10-21|2014-07-15|Lockheed Martin Corporation|Methods and systems for creating free space reflective optical surfaces|
US8625200B2|2010-10-21|2014-01-07|Lockheed Martin Corporation|Head-mounted display apparatus employing one or more reflective optical surfaces|
US9632315B2|2010-10-21|2017-04-25|Lockheed Martin Corporation|Head-mounted display apparatus employing one or more fresnel lenses|
US8678282B1|2010-11-29|2014-03-25|Lockheed Martin Corporation|Aim assist head-mounted display apparatus|
US20120326948A1|2011-06-22|2012-12-27|Microsoft Corporation|Environmental-light filter for see-through head-mounted display device|
JP5370427B2|2011-07-24|2013-12-18|株式会社デンソー|Head-up display device|
CA2750287C|2011-08-29|2012-07-03|Microsoft Corporation|Gaze detection in a see-through, near-eye, mixed reality display|
US9459457B2|2011-12-01|2016-10-04|Seebright Inc.|Head mounted display with remote control|
US8970960B2|2011-12-22|2015-03-03|Mattel, Inc.|Augmented reality head gear|
ES2791722T3|2012-02-02|2020-11-05|Airbus Helicopters Espana Sa|Virtual mockup with haptic portable aid|
US9552673B2|2012-10-17|2017-01-24|Microsoft Technology Licensing, Llc|Grasping virtual objects in augmented reality|
EP2887123A1|2013-12-18|2015-06-24|Thomson Licensing|Optical see-through glass type display device and corresponding optical element|
US10073264B2|2007-08-03|2018-09-11|Lumus Ltd.|Substrate-guide optical device|
US9965681B2|2008-12-16|2018-05-08|Osterhout Group, Inc.|Eye imaging in head worn computing|
US9250444B2|2009-11-21|2016-02-02|Immy Inc.|Head mounted display device|
US9632315B2|2010-10-21|2017-04-25|Lockheed Martin Corporation|Head-mounted display apparatus employing one or more fresnel lenses|
KR101928764B1|2010-12-28|2018-12-13|록히드 마틴 코포레이션|Head-mounted display apparatus employing one or more reflective optical surfaces|
US8781794B2|2010-10-21|2014-07-15|Lockheed Martin Corporation|Methods and systems for creating free space reflective optical surfaces|
US8625200B2|2010-10-21|2014-01-07|Lockheed Martin Corporation|Head-mounted display apparatus employing one or more reflective optical surfaces|
US10359545B2|2010-10-21|2019-07-23|Lockheed Martin Corporation|Fresnel lens with reduced draft facet visibility|
US8885882B1|2011-07-14|2014-11-11|The Research Foundation For The State University Of New York|Real time eye tracking for human computer interaction|
US9454007B1|2012-05-07|2016-09-27|Lockheed Martin Corporation|Free-space lens design and lenses therefrom|
US20130322683A1|2012-05-30|2013-12-05|Joel Jacobs|Customized head-mounted display device|
JP2014130218A|2012-12-28|2014-07-10|Japan Display Inc|Display device|
US20140267389A1|2013-03-14|2014-09-18|Exelis Inc.|Night Vision Display Overlaid with Sensor Data|
EP2972543A1|2013-03-15|2016-01-20|Immy Inc.|Head mounted display with micro-display alignment mechanism|
CN108132537B|2013-04-11|2021-03-16|索尼公司|Display device|
EP2985652A4|2013-04-11|2016-12-14|Sony Corp|Image display device and display device|
US20140333773A1|2013-05-11|2014-11-13|Randy James Davis|Portable audio/ video mask|
FR3006457B1|2013-05-30|2015-07-03|Commissariat Energie Atomique|IMAGE DISPLAY DEVICE IN INCREASED REALITY|
US20140362110A1|2013-06-08|2014-12-11|Sony Computer Entertainment Inc.|Systems and methods for customizing optical representation of views provided by a head mounted display based on optical prescription of a user|
US9488837B2|2013-06-28|2016-11-08|Microsoft Technology Licensing, Llc|Near eye display|
EP3023829A4|2013-07-16|2017-05-10|Sony Corporation|Display device|
KR102187843B1|2013-08-19|2020-12-07|삼성전자 주식회사|Method for changing screen in a user device terminal having pen|
US20160327798A1|2014-01-02|2016-11-10|Empire Technology Development Llc|Augmented realitysystem|
US9939934B2|2014-01-17|2018-04-10|Osterhout Group, Inc.|External user interface for head worn computing|
US10254856B2|2014-01-17|2019-04-09|Osterhout Group, Inc.|External user interface for head worn computing|
US9846308B2|2014-01-24|2017-12-19|Osterhout Group, Inc.|Haptic systems for head-worn computers|
US9594246B2|2014-01-21|2017-03-14|Osterhout Group, Inc.|See-through computer display systems|
US9753288B2|2014-01-21|2017-09-05|Osterhout Group, Inc.|See-through computer display systems|
US9715112B2|2014-01-21|2017-07-25|Osterhout Group, Inc.|Suppression of stray light in head worn computing|
US10191279B2|2014-03-17|2019-01-29|Osterhout Group, Inc.|Eye imaging in head worn computing|
US9651784B2|2014-01-21|2017-05-16|Osterhout Group, Inc.|See-through computer display systems|
US9529195B2|2014-01-21|2016-12-27|Osterhout Group, Inc.|See-through computer display systems|
US9684172B2|2014-12-03|2017-06-20|Osterhout Group, Inc.|Head worn computer display systems|
US9836122B2|2014-01-21|2017-12-05|Osterhout Group, Inc.|Eye glint imaging in see-through computer display systems|
US9952664B2|2014-01-21|2018-04-24|Osterhout Group, Inc.|Eye imaging in head worn computing|
US9615742B2|2014-01-21|2017-04-11|Osterhout Group, Inc.|Eye imaging in head worn computing|
US20150205135A1|2014-01-21|2015-07-23|Osterhout Group, Inc.|See-through computer display systems|
US9298007B2|2014-01-21|2016-03-29|Osterhout Group, Inc.|Eye imaging in head worn computing|
US9494800B2|2014-01-21|2016-11-15|Osterhout Group, Inc.|See-through computer display systems|
US20150241964A1|2014-02-11|2015-08-27|Osterhout Group, Inc.|Eye imaging in head worn computing|
US9651788B2|2014-01-21|2017-05-16|Osterhout Group, Inc.|See-through computer display systems|
US9766463B2|2014-01-21|2017-09-19|Osterhout Group, Inc.|See-through computer display systems|
US9811159B2|2014-01-21|2017-11-07|Osterhout Group, Inc.|Eye imaging in head worn computing|
US9400390B2|2014-01-24|2016-07-26|Osterhout Group, Inc.|Peripheral lighting for head worn computing|
JP6315240B2|2014-02-03|2018-04-25|株式会社リコー|Image display device, moving body, and lens array|
US9229233B2|2014-02-11|2016-01-05|Osterhout Group, Inc.|Micro Doppler presentations in head worn computing|
US9401540B2|2014-02-11|2016-07-26|Osterhout Group, Inc.|Spatial location presentation in head worn computing|
US9299194B2|2014-02-14|2016-03-29|Osterhout Group, Inc.|Secure sharing in head worn computing|
US20160187651A1|2014-03-28|2016-06-30|Osterhout Group, Inc.|Safety for a vehicle operator with an hmd|
US11227294B2|2014-04-03|2022-01-18|Mentor Acquisition One, Llc|Sight information collection in head worn computing|
CN104469440B|2014-04-16|2018-10-30|成都理想境界科技有限公司|Video broadcasting method, video player and corresponding playback equipment|
IL232197A|2014-04-23|2018-04-30|Lumus Ltd|Compact head-mounted display system|
US10853589B2|2014-04-25|2020-12-01|Mentor Acquisition One, Llc|Language translation with head-worn computing|
US9651787B2|2014-04-25|2017-05-16|Osterhout Group, Inc.|Speaker assembly for headworn computer|
US9672210B2|2014-04-25|2017-06-06|Osterhout Group, Inc.|Language translation with head-worn computing|
US9746686B2|2014-05-19|2017-08-29|Osterhout Group, Inc.|Content position calibration in head worn computing|
US9595138B2|2014-05-29|2017-03-14|Commissariat A L'energie Atomique Et Aux Energies Alternatives|Augmented reality display device|
US9841599B2|2014-06-05|2017-12-12|Osterhout Group, Inc.|Optical configurations for head-worn see-through displays|
US10649220B2|2014-06-09|2020-05-12|Mentor Acquisition One, Llc|Content presentation in head worn computing|
US10663740B2|2014-06-09|2020-05-26|Mentor Acquisition One, Llc|Content presentation in head worn computing|
US9575321B2|2014-06-09|2017-02-21|Osterhout Group, Inc.|Content presentation in head worn computing|
CN104253989B|2014-06-09|2018-05-18|黄石|Full multi-view image display device|
US20150362733A1|2014-06-13|2015-12-17|Zambala Lllp|Wearable head-mounted display and camera system with multiple modes|
US9810906B2|2014-06-17|2017-11-07|Osterhout Group, Inc.|External user interface for head worn computing|
US11103122B2|2014-07-15|2021-08-31|Mentor Acquisition One, Llc|Content presentation in head worn computing|
US20160019715A1|2014-07-15|2016-01-21|Osterhout Group, Inc.|Content presentation in head worn computing|
US9829707B2|2014-08-12|2017-11-28|Osterhout Group, Inc.|Measuring content brightness in head worn computing|
US9671613B2|2014-09-26|2017-06-06|Osterhout Group, Inc.|See-through computer display systems|
US10684476B2|2014-10-17|2020-06-16|Lockheed Martin Corporation|Head-wearable ultra-wide field of view display device|
US10684687B2|2014-12-03|2020-06-16|Mentor Acquisition One, Llc|See-through computer display systems|
CN104392706B|2014-12-16|2017-03-22|京东方科技集团股份有限公司|Correction method and correction device for curved surface displaying and curved surface display equipment|
USD751552S1|2014-12-31|2016-03-15|Osterhout Group, Inc.|Computer glasses|
USD753114S1|2015-01-05|2016-04-05|Osterhout Group, Inc.|Air mouse|
EP3243098A4|2015-01-06|2018-08-29|Vuzix Corporation|Head mounted imaging apparatus with curved lenslet array|
KR20170104604A|2015-01-21|2017-09-15|테세랜드 엘엘씨|Display with total internal reflection|
US20160239985A1|2015-02-17|2016-08-18|Osterhout Group, Inc.|See-through computer display systems|
US9939650B2|2015-03-02|2018-04-10|Lockheed Martin Corporation|Wearable display system|
WO2016163231A1|2015-04-09|2016-10-13|シャープ株式会社|Spectacle type display device|
CN104777616B|2015-04-27|2018-05-04|塔普翊海(上海)智能科技有限公司|Have an X-rayed wear-type light field display device|
US20160349509A1|2015-05-26|2016-12-01|Microsoft Technology Licensing, Llc|Mixed-reality headset|
US10268044B1|2015-05-29|2019-04-23|Lockheed Martin Corporation|Three-dimensionalimmersive viewer system|
US10368059B2|2015-10-02|2019-07-30|Atheer, Inc.|Method and apparatus for individualized three dimensional display calibration|
JP6662599B2|2015-10-05|2020-03-11|ミツミ電機株式会社|Display device|
US10754156B2|2015-10-20|2020-08-25|Lockheed Martin Corporation|Multiple-eye, single-display, ultrawide-field-of-view optical see-through augmented reality system|
US10147235B2|2015-12-10|2018-12-04|Microsoft Technology Licensing, Llc|AR display with adjustable stereo overlap zone|
WO2017113757A1|2015-12-31|2017-07-06|北京小鸟看看科技有限公司|Method of laying out surrounding interface, methods of switching content and switching list in three-dimensional immersive environment|
CN106959510A|2016-01-08|2017-07-18|京东方科技集团股份有限公司|A kind of display device and virtual reality glasses|
CN108700743A|2016-01-22|2018-10-23|康宁股份有限公司|Wide visual field individual's display|
US9995936B1|2016-04-29|2018-06-12|Lockheed Martin Corporation|Augmented reality systems having a virtual image overlaying an infrared portion of a live scene|
US10156723B2|2016-05-12|2018-12-18|Google Llc|Display pre-distortion methods and apparatus for head-mounted displays|
US10151927B2|2016-05-31|2018-12-11|Falcon's Treehouse, Llc|Virtual reality and augmented reality head set for ride vehicle|
US9927615B2|2016-07-25|2018-03-27|Qualcomm Incorporated|Compact augmented reality glasses with folded imaging optics|
CN106444023A|2016-08-29|2017-02-22|北京知境科技有限公司|Super-large field angle binocular stereoscopic display transmission type augmented reality system|
KR20190082916A|2016-11-16|2019-07-10|매직 립, 인코포레이티드|Multi-resolution display assembly for head-mounted display systems|
CN106504650B|2016-11-23|2020-03-06|京东方科技集团股份有限公司|Light source structure and display device|
JP6857800B2|2016-12-21|2021-04-14|パナソニックIpマネジメント株式会社|Virtual image display device|
WO2018127913A1|2017-01-04|2018-07-12|Lumus Ltd.|Optical system for near-eye displays|
CN110376742A|2017-03-23|2019-10-25|华为机器有限公司|Near-eye display and near-eye display system|
CN108881894B|2017-05-08|2020-01-17|华为技术有限公司|VR multimedia experience quality determination method and device|
US10795178B2|2017-05-09|2020-10-06|Amtran Technology Co., Ltd.|Device for mixed reality|
US10338400B2|2017-07-03|2019-07-02|Holovisions LLC|Augmented reality eyewear with VAPE or wear technology|
US10859834B2|2017-07-03|2020-12-08|Holovisions|Space-efficient optical structures for wide field-of-view augmented realityeyewear|
US10976551B2|2017-08-30|2021-04-13|Corning Incorporated|Wide field personal display device|
US10930709B2|2017-10-03|2021-02-23|Lockheed Martin Corporation|Stacked transparent pixel structures for image sensors|
TWI679555B|2017-10-12|2019-12-11|華碩電腦股份有限公司|Augmented reality system and method for providing augmented reality|
US10510812B2|2017-11-09|2019-12-17|Lockheed Martin Corporation|Display-integrated infrared emitter and sensor structures|
TWI677711B|2017-11-13|2019-11-21|財團法人金屬工業研究發展中心|System for augmented reality-image|
EP3711291A4|2017-11-14|2021-08-04|Forgetspecs.com Pty Ltd|Device and method for altering the vergence of light to improve human vision of an electronic display|
US10594951B2|2018-02-07|2020-03-17|Lockheed Martin Corporation|Distributed multi-aperture camera array|
US10838250B2|2018-02-07|2020-11-17|Lockheed Martin Corporation|Display assemblies with electronically emulated transparency|
US10129984B1|2018-02-07|2018-11-13|Lockheed Martin Corporation|Three-dimensional electronics distribution by geodesic faceting|
US10951883B2|2018-02-07|2021-03-16|Lockheed Martin Corporation|Distributed multi-screen array for high density display|
US10652529B2|2018-02-07|2020-05-12|Lockheed Martin Corporation|In-layer Signal processing|
US10979699B2|2018-02-07|2021-04-13|Lockheed Martin Corporation|Plenoptic cellular imaging system|
US10690910B2|2018-02-07|2020-06-23|Lockheed Martin Corporation|Plenoptic cellular vision correction|
US10678056B2|2018-02-26|2020-06-09|Google Llc|Augmented reality light field head-mounted displays|
US10373391B1|2018-04-23|2019-08-06|AbdurRahman Bin Shahzad Bhatti|Augmented reality system for fitness|
US10871653B1|2018-04-24|2020-12-22|Lc-Tec Displays Ab|Viewing direction independent single-layer, pixelated light dimming filter|
IL259518D0|2018-05-22|2018-06-28|Lumus Ltd|Optical system and method for improvement of light field uniformity|
US10969586B2|2018-07-10|2021-04-06|Darwin Hu|Ultra light-weight see-through display glasses|
US10778963B2|2018-08-10|2020-09-15|Valve Corporation|Head-mounted displaywith spatially-varying retarder optics|
US10996463B2|2018-08-10|2021-05-04|Valve Corporation|Head-mounted displaywith spatially-varying retarder optics|
KR20200034909A|2018-09-21|2020-04-01|삼성디스플레이 주식회사|Display device and method for manufacturing the same|
US10866413B2|2018-12-03|2020-12-15|Lockheed Martin Corporation|Eccentric incident luminance pupil tracking|
US10698201B1|2019-04-02|2020-06-30|Lockheed Martin Corporation|Plenoptic cellular axis redirection|
WO2021077850A1|2019-10-21|2021-04-29|华为技术有限公司|Display panel, near-eye display optical system, and head-mounted display device|
Legal status:
2020-08-25| B06F| Objections, documents and/or translations needed after an examination request, according to [chapter 6.6 patent gazette]|
2020-09-01| B06U| Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]|
2021-06-29| B06A| Patent application procedure suspended [chapter 6.1 patent gazette]|
2021-10-13| B08F| Application dismissed because of non-payment of annual fees [chapter 8.6 patent gazette]|Free format text: REGARDING THE 10TH ANNUITY.|
2021-11-03| B350| Update of information on the portal [chapter 15.35 patent gazette]|
2022-02-01| B08K| Patent lapsed as no evidence of payment of the annual fee has been furnished to INPI [chapter 8.11 patent gazette]|Free format text: IN VIEW OF THE DISMISSAL PUBLISHED IN RPI 2649 OF 2021-10-13, AND CONSIDERING THE ABSENCE OF ANY RESPONSE WITHIN THE LEGAL TIME LIMITS, THE DISMISSAL OF THE PATENT APPLICATION IS TO BE MAINTAINED, AS PROVIDED IN ARTICLE 12 OF RESOLUTION 113/2013.|
Priority:
Application No. | Filing date | Patent title
US42393410P| true| 2010-12-16|2010-12-16|
US61/423,934|2010-12-16|
US201061424162P| true| 2010-12-17|2010-12-17|
US201061424166P| true| 2010-12-17|2010-12-17|
US61/424,166|2010-12-17|
US61/424,162|2010-12-17|
PCT/US2011/065201|WO2012083042A1|2010-12-16|2011-12-15|Collimating display with pixel lenses|